• 0

    posted a message on LibCompress
    Well, the issue is only triggered by DecompressHuffman; DecompressLZW does not cause it. I can't imagine what sort of "general problem" I could be causing -- my code does exactly what you've done, using the same data. If the issue exists in AceSerializer or LibCompress, it's out of my hands. I suppose something funky could be going on even further up the call stack, like in CallbackHandler or AceComm-3.0, which are the other libraries involved here; you'd probably need to replicate my entire call stack to repro the issue. Either way, I don't want to mess with it -- I just won't use LibCompress. I appreciate you looking into it!
    Posted in: Libraries
  • 0

    posted a message on LibCompress
    You can get AceSerializer-3.0 by going to files.wowace.com and downloading the Ace3 package. Bring the library into your add-on by putting it in a Libs folder and adding Libs\AceSerializer-3.0\AceSerializer-3.0.xml to your toc (or to embeds.xml). From there, add the mixins to your add-on object like so:

    LibStub("AceSerializer-3.0"):Embed(MyAddOnObject)


    After that, it's easy to serialize and deserialize an object:

    local serializedData = self:Serialize(someData)
    -- Deserialize returns a success flag first, then the value (or an error message)
    local success, deserialized = self:Deserialize(serializedData)


    Note that your add-on doesn't need to use Ace3 to embed this library as shown.
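    For anyone copying the above, here's the full round trip as a sketch (MyAddOnObject and someData are placeholders). Note that Deserialize returns a success flag before the value, so check it:

    local AceSerializer = LibStub("AceSerializer-3.0")
    AceSerializer:Embed(MyAddOnObject)

    -- Serialize returns a plain string suitable for storage or transmission
    local serializedData = MyAddOnObject:Serialize(someData)

    -- Deserialize returns success, value (or false, errorMessage on failure)
    local success, value = MyAddOnObject:Deserialize(serializedData)
    if not success then
        error("deserialization failed: " .. tostring(value))
    end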
    Posted in: Libraries
  • 0

    posted a message on LibCompress
    I emailed you with the data. If that doesn't help, I can send you the entire add-on for testing.
    Posted in: Libraries
  • 0

    posted a message on LibCompress
    I'll repro the issue and send you the string that caused it. However, with the high memory churn and processor usage, I might not use this library at all for this add-on.
    Posted in: Libraries
  • 0

    posted a message on LibCompress
    It was approximately 20K (20,000 bytes) uncompressed, and about 16.5K compressed -- definitely nothing that should be causing out-of-memory errors.
    Posted in: Libraries
  • 0

    posted a message on LibCompress
    Galmok, I haven't quite gotten to emailing you yet, but I was planning on sending you the data I was compressing. It sounds like you've reproduced the issue.

    I'm not sure if there is a limit to the number of keys in the table, but I would think it would be extremely large if there is. Could you perhaps be seeing a stack overflow?
    Posted in: Libraries
  • 0

    posted a message on LibCompress
    Thanks, jjsheets. Maybe galmok will see this. I'll see if there's any troubleshooting I can do on the code, but the algorithm is a bit dense.
    Posted in: Libraries
  • 0

    posted a message on LibCompress
    Has anyone been using this lately? Is it working well? I'm working on a new add-on that needs to send a fairly large amount of data between users (about 20-30K). This takes a long time to send via AceComm-3.0 due to the message size limit, so I've brought in LibCompress, which compresses the data to an average of about 65% of the original size. The data is a table that has been run through AceSerializer-3.0. Compression appears to work, but decompression raises an error on line 522 (in the DecompressHuffman function):

    [2008/05/26 20:56:22-118-x4]: memory allocation error: block too big:
    <in C code>: ?
    QuestAgent-73719\Libs\LibCompress\LibCompress.lua:522: in function <...e\AddOns\QuestAgent\Libs\LibCompress\LibCompress.lua:428>
    (tail call): ?:
    QuestAgent-73719\QuestAgent.lua:60: in function `?'
    CallbackHandler-1.0\CallbackHandler-1.0.lua:146: in function <...edia-3.0\CallbackHandler-1.0\CallbackHandler-1.0.lua:146>
    <string>:"safecall Dispatcher[4]":4: in function <[string "safecall Dispatcher[4]"]:4>
    <in C code>: ?
    <string>:"safecall Dispatcher[4]":13: in function `?'
    CallbackHandler-1.0\CallbackHandler-1.0.lua:91: in function `Fire'
    AceComm-3.0\AceComm-3.0.lua:180: in function `aceCommReassemblerFunc'
    AceComm-3.0\AceComm-3.0.lua:243: in function <...terface\AddOns\Omen\Libs\AceComm-3.0\AceComm-3.0.lua:235>


    I tried switching to just CompressLZW instead of Compress, and everything works just fine (but the LZW algorithm compresses it far less than Huffman did). Is anyone still working on this? I can provide more code for testing if needed.

    Also: did the Huffman algorithm here ever get updated to escape \000 characters?
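    For reference, the pipeline I'm describing looks roughly like this (a sketch only -- MyAddOn, the "QAgent" prefix, and the "RAID" distribution are placeholders, and I'm assuming LibCompress's GetAddonEncodeTable for escaping bytes like \000 that the addon channel can't carry):

    local LibCompress = LibStub("LibCompress")
    local encoder = LibCompress:GetAddonEncodeTable()

    function MyAddOn:SendData(someData)
        local serialized = self:Serialize(someData)                -- AceSerializer-3.0
        local compressed = LibCompress:CompressHuffman(serialized) -- where the trouble starts
        self:SendCommMessage("QAgent", encoder:Encode(compressed), "RAID")
    end

    function MyAddOn:OnCommReceived(prefix, message, distribution, sender)
        local compressed = encoder:Decode(message)
        local serialized, err = LibCompress:DecompressHuffman(compressed)
        if not serialized then
            error("decompression failed: " .. tostring(err))
        end
        local success, someData = self:Deserialize(serialized)
        -- use someData here
    end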
    Posted in: Libraries
  • 0

    posted a message on What's needed to use the SVN?
    I've also asked for SVN access (twice), with no reply. Are we being ignored? I totally understand if everyone is just slammed. People are asking me to get my add-on into the SVN, but no luck getting any reply from the admins. :)
    Posted in: Lua Code Discussion