May 30, 2008

Well, the issue is only triggered by DecompressHuffman; DecompressLZW does not cause it. I can't imagine what sort of "general problem" I could be causing -- my code is only doing exactly what you've done, using the same data. If the issue exists in AceSerializer or LibCompress, that's out of my hands to deal with. I suppose there could be something funky going on even further up the call stack, like in CallbackHandler or AceComm-3.0, which are the other libraries involved here. You'd probably need to replicate my entire call stack to reproduce the issue. Well, I don't want to mess with it -- I just won't use LibCompress. I appreciate you looking into it!

Posted in: Libraries
May 30, 2008

You can get AceSerializer-3.0 by going to files.wowace.com and downloading the Ace3 package. Bring the library into your add-on by putting it in a Libs folder and adding Libs\AceSerializer-3.0\AceSerializer-3.0.xml to your .toc (or to embeds.xml). From there, add the mixins to your add-on object like so:

After that, it's easy to serialize and deserialize an object (note that Deserialize returns a success flag before the value):

local serializedData = self:Serialize(someData)
local success, someData = self:Deserialize(serializedData)

Note that your add-on doesn't need to use Ace3 to embed this library as shown.

Posted in: Libraries
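The "like so" step above refers to embedding the mixins into your add-on object. A minimal sketch of the two usual routes, assuming the standard LibStub/Ace3 conventions; the add-on name "MyAddon" is a placeholder:

```lua
-- With AceAddon-3.0, request the mixin at creation time:
local MyAddon = LibStub("AceAddon-3.0"):NewAddon("MyAddon", "AceSerializer-3.0")

-- Without AceAddon-3.0, embed into a plain table instead:
-- local MyAddon = {}
-- LibStub("AceSerializer-3.0"):Embed(MyAddon)
```

Either way, MyAddon:Serialize(...) and MyAddon:Deserialize(...) become available on the object.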
May 29, 2008

Galmok, I haven't quite gotten to emailing you yet, but I was planning on sending you the data I was compressing. It sounds like you've reproduced the issue.

I'm not sure if there is a limit to the number of keys in the table, but I would think it would be extremely large if there is. Could you perhaps be seeing a stack overflow?

Posted in: Libraries
May 27, 2008

Has anyone been using this lately? Is it working well? I'm working on a new add-on which needs to send a fairly large amount of data between users (about 20-30K). This takes a long time to send using AceComm-3.0 due to the message size limit. I've brought in LibCompress, which is able to compress the data to an average of about 65% of the original size. The data is actually a table which has been run through AceSerializer-3.0. The compression appears to be working, but decompression yields an error on line 522 (in the DecompressHuffman function):

[2008/05/26 20:56:22-118-x4]: memory allocation error: block too big:
<in C code>: ?
QuestAgent-73719\Libs\LibCompress\LibCompress.lua:522: in function <...e\AddOns\QuestAgent\Libs\LibCompress\LibCompress.lua:428>
(tail call): ?:
QuestAgent-73719\QuestAgent.lua:60: in function `?'
CallbackHandler-1.0\CallbackHandler-1.0.lua:146: in function <...edia-3.0\CallbackHandler-1.0\CallbackHandler-1.0.lua:146>
<string>:"safecall Dispatcher":4: in function <[string "safecall Dispatcher"]:4>
<in C code>: ?
<string>:"safecall Dispatcher":13: in function `?'
CallbackHandler-1.0\CallbackHandler-1.0.lua:91: in function `Fire'
AceComm-3.0\AceComm-3.0.lua:180: in function `aceCommReassemblerFunc'
AceComm-3.0\AceComm-3.0.lua:243: in function <...terface\AddOns\Omen\Libs\AceComm-3.0\AceComm-3.0.lua:235>

I tried switching to just CompressLZW instead of Compress, and everything works just fine (but the LZW algorithm compresses far less than Huffman did). Is anyone still working on this? I can provide more code for testing if needed.

Also: did the Huffman algorithm here ever get updated to escape \000 characters?

Posted in: Libraries
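The LZW workaround described above can be sketched as follows. This assumes the standard LibCompress entry points (CompressLZW/DecompressLZW fetched through LibStub) and its nil-plus-error-message return convention; serializedData stands in for whatever string you produced with AceSerializer:

```lua
local LibCompress = LibStub:GetLibrary("LibCompress")

-- Compress with LZW only, sidestepping the Huffman path that errors out.
local compressed = LibCompress:CompressLZW(serializedData)

-- Decompression returns the original string, or nil plus an error message.
local decompressed, err = LibCompress:DecompressLZW(compressed)
if not decompressed then
    print("Decompression failed: " .. tostring(err))
end
```

The trade-off, as noted in the post, is a worse compression ratio than Huffman in exchange for avoiding the allocation error.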
May 7, 2008

I've also asked for SVN access (twice), with no reply. Are we being ignored? I totally understand if everyone is just slammed. People are asking me to get my add-on into the SVN, but no luck getting any reply from the admins. :)

Posted in: Lua Code Discussion