JSON to CSV conversion tool.
Install with npm:

```sh
$ npm install easy-csv
```
Then use the module with either Node-style error-first callbacks or bluebird.js promises:
```js
let CSV = require('easy-csv');
let data = [{ some: 'data', number: 1 }];

// Using callbacks
CSV.toCSV(data, function (err, csv) {
  if (err) {
    // do something with any errors
  } else {
    console.log(csv);
    // some,number
    // data,1
  }
});

// Using promises
CSV.toCSV(data)
  .then(csv => {
    // some,number
    // data,1
  })
  .catch(err => {
    // do something with the error
  });
```
### Options
The available options are easy to use and fairly straightforward:
* delimiter: The character inserted between values in a CSV row. Default: ','.
* endline: The character(s) inserted to mark the end of a CSV row. Default: '\n'.
* fields: The object keys whose values will be written to each CSV row. Default: the keys of the first object in the array.
* fieldNames: The header names used in the first CSV row. Default: same as the fields array.
Options example:
```js
let options = {
  delimiter: '|',   // use pipes between values within a CSV row
  endline: '\n\r',  // end all CSV rows (except the last) with \n\r instead of \n
  fields: ['some', 'number'],      // only insert values for the keys 'some' and 'number'
  fieldNames: ['First', 'Second']  // make the CSV header row 'First|Second'
};
```
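To make the semantics of these options concrete, here is a minimal, self-contained sketch of a JSON-to-CSV conversion that honors them. This is only an illustration of what each option does, not the module's actual implementation:

```js
// Sketch of the option semantics described above (not easy-csv's internals).
function toCSV(data, options = {}) {
  const delimiter = options.delimiter || ',';
  const endline = options.endline || '\n';
  const fields = options.fields || Object.keys(data[0]);
  const fieldNames = options.fieldNames || fields;

  const header = fieldNames.join(delimiter);
  const rows = data.map(row => fields.map(f => row[f]).join(delimiter));
  return [header, ...rows].join(endline);
}

const data = [{ some: 'data', number: 1 }, { some: 'more', number: 2 }];
const options = {
  delimiter: '|',
  fields: ['some', 'number'],
  fieldNames: ['First', 'Second']
};

console.log(toCSV(data, options));
// First|Second
// data|1
// more|2
```

With no options passed, the sketch falls back to the documented defaults: comma delimiter, `\n` endlines, and headers taken from the keys of the first object.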
### Tests
A handful of tests exist for 'happy path' uses. They can be run from the command line by navigating to the module directory and running:
```sh
$ npm run test
```