I hope that I can describe my problem in a good way...

What I want is to read the file content into an array, so I call push(e2.target.result) - but if the files are too big, nothing is pushed, and I can't use the content in a later process :-(
So how do I push larger files into an array?

What I have tried:

function DropFiles(event) {
    event.preventDefault();
    var vVoucherArray = [];
    var vFileNameArray = [];
    var vFileContentArray = [];
    voucherNo = event.target.id.substring(4, event.target.id.length);
    if (event.dataTransfer.items) {
        for (var i = 0; i < event.dataTransfer.items.length; i++) {
            if (event.dataTransfer.items[i].kind === 'file') {
                const file = event.dataTransfer.items[i].getAsFile();
                readFile(file, function (e2) {
                    // use result in callback...
                    vVoucherArray.push(voucherNo);
                    vFileNameArray.push(file.name);
                    vFileContentArray.push(e2.target.result);
                });
            }
        }
        doUploadFile(vVoucherArray, vFileNameArray, vFileContentArray);
    }
    else {
        for (var i = 0; i < event.dataTransfer.files.length; i++) {
            const file = event.dataTransfer.files[i];
            readFile(file, function (e2) {
                // use result in callback...
                vVoucherArray.push(voucherNo);
                vFileNameArray.push(file.name);
                vFileContentArray.push(e2.target.result);
            });
        }
        doUploadFile(vVoucherArray, vFileNameArray, vFileContentArray);
    }
}
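
readFile() isn't defined above; presumably it wraps the standard FileReader API, roughly like the sketch below (the readAsDataURL choice is an assumption - it could equally be readAsText or readAsArrayBuffer):

function readFile(file, onLoad) {
    // Hypothetical reconstruction of the readFile() helper used above.
    var reader = new FileReader();
    reader.onload = onLoad;      // invoked asynchronously with the load event (e2 above)
    reader.readAsDataURL(file);  // assumption: result is a base64 data: URL
}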
Posted; updated 15-Aug-22 22:19pm

1 solution

Why do you need to store the file contents in an array? Can you not just call doUploadFile() for each file being uploaded? The reason I ask is that storing file content in an array uses up memory (RAM) on the client's machine. If a user uploads a 2GB document, you're asking the browser to hold that entire 2GB document in memory, which isn't a very good idea. For larger files you'd be better off streaming them to the back-end using a technique called "chunking", where the browser takes the large document a piece at a time and uploads it piece by piece to avoid loading the whole thing into RAM.

Here's an article with a good example of how to chunk files[^]
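
As a rough illustration, a minimal chunked upload might look like the sketch below. The /upload-chunk endpoint, the form field names, and the 1 MB chunk size are all assumptions - the linked article covers a fuller implementation, including server-side reassembly.

// Minimal chunking sketch. The back-end endpoint (/upload-chunk) and
// its reassembly logic are assumptions, not part of the question.
const CHUNK_SIZE = 1024 * 1024; // 1 MB per request

async function uploadFileInChunks(file, voucherNo) {
    const totalChunks = Math.ceil(file.size / CHUNK_SIZE);
    for (let i = 0; i < totalChunks; i++) {
        // Blob.slice() creates a view of a file region without reading it into RAM
        const chunk = file.slice(i * CHUNK_SIZE, (i + 1) * CHUNK_SIZE);
        const form = new FormData();
        form.append('voucherNo', voucherNo);
        form.append('fileName', file.name);
        form.append('chunkIndex', String(i));
        form.append('totalChunks', String(totalChunks));
        form.append('chunk', chunk);
        // Only one chunk is held in memory per request
        await fetch('/upload-chunk', { method: 'POST', body: form });
    }
}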

TL;DR - You shouldn't expect to be able to load entire documents into memory in JS; the browser may stop this to prevent using up too much memory.
 
 
Comments
MichaelEriksen 16-Aug-22 4:25am    
Thanks for the answer - but I try to upload all files at once, since I need to give the user a complete list of invalid file types. The files are not very big - but it seems that they are too big for the process.
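
(Collecting the list of invalid file types doesn't actually require reading file contents, though - a check over the file names alone would do. A rough sketch, assuming a hypothetical allow-list of extensions:)

// Sketch: report invalid files by extension before uploading anything.
// The allowed-extension list is a hypothetical example.
var ALLOWED_EXTENSIONS = ['.pdf', '.png', '.jpg'];

function findInvalidFiles(files) {
    var invalid = [];
    for (var i = 0; i < files.length; i++) {
        var name = files[i].name.toLowerCase();
        var ok = ALLOWED_EXTENSIONS.some(function (ext) {
            return name.endsWith(ext);
        });
        if (!ok) invalid.push(files[i].name);
    }
    return invalid;
}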
Chris Copeland 16-Aug-22 4:39am    
If the files aren't too large (i.e. less than 2MB), you might also want to consider debugging the code to see what these variables are producing. I don't know what the readFile() method is, but it's always worth checking whether any console errors are being produced, and stepping through to see the value of each variable. It could just be a permissions issue?

But if there is a chance users could be uploading larger files, then it's definitely worth looking into an alternative way of handling this, even if it's just adding a check to prevent too-large files from being uploaded (Blob.size[^]).
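
A size guard along those lines could be as simple as the sketch below (the 2 MB limit is a hypothetical figure):

// File inherits size (in bytes) from Blob, so no reading is needed.
var MAX_BYTES = 2 * 1024 * 1024; // hypothetical 2 MB cap

function isTooLarge(file) {
    return file.size > MAX_BYTES;
}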
