What are Node.js streams?
Streams are collections of data, just like arrays or strings. The difference is that streams might not be available all at once, and they don't have to fit in memory. This makes streams really powerful when working with large amounts of data, or data that's coming from an external source one chunk at a time.
If you want to add data to a file without having to read it and rewrite the whole thing again and again, this is the solution: you can use appendFile, which creates a new file handle each time it's called and appends asynchronously.
appendFile
```javascript
const fs = require('fs');

fs.appendFile('append.csv', 'data to append', function (err) {
  if (err) throw err;
  console.log('logged!');
});
```
or
```javascript
const fs = require('fs');

var data = "\nThis is soooo cool like css-trick";

fs.appendFile('learn.txt', data, function (err) {
  if (err) throw err;
  console.log('data was logged!');
});

// output:
// $ node learnnode.js
// data was logged!
```
Now if you check your file, you should see the appended data, and you can append as many times as you want. However, if you're dealing with a large amount of data, I suggest you use a WriteStream instead.
Synchronously:
```javascript
const fs = require('fs');

fs.appendFileSync('message.txt', 'data to append');
```
WriteStream
A WriteStream comes in handy if you append repeatedly to the same file. It's much better to use one so you can avoid FATAL ERROR crashes that eat up your memory; whether appendFile crashes depends on how many times you append to the one file.
Example using appendFile:
```javascript
const fs = require('fs');

console.log('start');
[...Array(1000000)].forEach(function (item, index) {
  fs.appendFile("append.txt", index + "\n", function (err) {
    if (err) console.log(err);
  });
});
console.log('end');
```
It will append to the file 1,000,000 times, but when you increase it to 10,000,000 you will end up with a FATAL ERROR:
```
FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
```
Example using WriteStream:
```javascript
const fs = require('fs');

var stream = fs.createWriteStream("append.txt", { flags: 'a' });

console.log('start');
[...Array(10000000)].forEach(function (item, index) {
  stream.write(index + "\n");
});
console.log('end');
stream.end();
```
Now, using a WriteStream, I was able to append to the file 100,000,000 times; at 1,000,000,000 it crashed too:
```json
{
  "header": {
    "reportVersion": 1,
    "event": "Allocation failed - JavaScript heap out of memory",
    "trigger": "FatalError",
    "filename": "report.20200810.224131.64549.0.001.json",
    "dumpEventTime": "2020-08-10T22:41:31Z",
    "dumpEventTimeStamp": "1597113691942",
    "processId": 64549,
    "cwd": "mypath/project"
  }
}
```
You can play with these numbers yourself by increasing them, since the exact limit depends on your PC.