
Commit

Merge pull request #19 from doug-martin/master
v0.1.0
doug-martin committed Mar 5, 2014
2 parents 45897fa + 4401b0a commit ebb6881
Showing 27 changed files with 1,171,631 additions and 1,454 deletions.
1 change: 0 additions & 1 deletion .travis.yml
@@ -1,6 +1,5 @@
language: node_js
node_js:
- 0.8
- 0.1
before_script:
- npm install -g grunt-cli
267 changes: 237 additions & 30 deletions README.md
@@ -13,35 +13,93 @@ edge cases such as multi line rows. However it does support escaped values, embe

## Usage

To parse a file.
### Parsing

All methods accept the following `options` (a combined sketch follows the list below).

* `headers=false`: Set to `true` if you expect the first line of your `CSV` to contain headers; alternately, you can specify an array of headers to use.
* `ignoreEmpty=false`: Set to `true` if you wish to ignore empty rows.
* `delimiter=','`: Specify an alternate delimiter such as `;` or `\t`.
* **NOTE** When specifying an alternate `delimiter` you may only pass in a single-character delimiter.
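
For example, the options above can be combined. A minimal sketch, assuming a hypothetical tab-delimited `people.tsv` file whose first line is a header row:

```javascript
var csv = require("fast-csv");

// Parse a tab-delimited file, using the first row as headers and
// skipping rows that are completely empty.
csv
    .fromPath("people.tsv", {headers: true, delimiter: "\t", ignoreEmpty: true})
    .on("record", function(person){
        console.log(person); // e.g. {firstName: "...", lastName: "..."}
    })
    .on("end", function(){
        console.log("done");
    });
```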

**events**

* `parse-error`: Emitted if there was an error parsing a row.
* `record`: Emitted when a record is parsed.
* `data-invalid`: Emitted if an invalid row is encountered, **only emitted if the `validate` function is used**.
* `data`: Emitted with the `stringified` version of a record.
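
A short sketch wiring up these events (the shape of the argument passed to `parse-error` is an assumption; the list above only says an error is emitted):

```javascript
var csv = require("fast-csv");

csv
    .fromPath("my.csv")
    .on("parse-error", function(error){
        // the exact argument passed here is an assumption
        console.error("could not parse a row", error);
    })
    .on("record", function(record){
        console.log(record); // the parsed row, an array (or an object if headers are used)
    })
    .on("data", function(data){
        console.log(data); // the stringified version of the record
    })
    .on("end", function(){
        console.log("done");
    });
```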

**`([options])`**

If you use `fast-csv` as a function it returns a transform stream that can be piped into.

```javascript
var fs = require("fs"),
    csv = require("fast-csv");

var stream = fs.createReadStream("my.csv");

var csvStream = csv()
    .on("record", function(data){
        console.log(data);
    })
    .on("end", function(){
        console.log("done");
    });

stream.pipe(csvStream);
```

**`.fromPath(path[, options])`**

This method parses a file from the specified path.

```javascript
var csv = require("fast-csv");

csv
    .fromPath("my.csv")
    .on("record", function(data){
        console.log(data);
    })
    .on("end", function(){
        console.log("done");
    });
```

**`.fromString(string[, options])`**

This method parses a string.

```javascript
var csv = require("fast-csv");

var CSV_STRING = 'a,b\n' +
    'a1,b1\n' +
    'a2,b2\n';

csv
    .fromString(CSV_STRING, {headers: true})
    .on("record", function(data){
        console.log(data);
    })
    .on("end", function(){
        console.log("done");
    });
```

You may also parse a stream.
**`.fromStream(stream[, options])`**

This method accepts a readable stream to parse data from.

```javascript
var stream = fs.createReadStream("my.csv");

csv
    .fromStream(stream)
    .on("record", function(data){
        console.log(data);
    })
    .on("end", function(){
        console.log("done");
    });
```

If you expect the first line of your `CSV` to contain headers you may pass in a `headers` option. Setting the headers option will cause each row to be emitted as an object rather than an array.

@@ -50,14 +108,14 @@
```javascript
var stream = fs.createReadStream("my.csv");

csv
    .fromStream(stream, {headers : true})
    .on("record", function(data){
        console.log(data);
    })
    .on("end", function(){
        console.log("done");
    });

```

@@ -67,14 +125,14 @@ the data columns will not match.
```javascript
var stream = fs.createReadStream("my.csv");

csv
    .fromStream(stream, {headers : ["firstName", "lastName", "address"]})
    .on("record", function(data){
        console.log(data);
    })
    .on("end", function(){
        console.log("done");
    });

```

@@ -86,14 +144,14 @@ Any rows consisting of nothing but empty strings and/or commas will be skipped,
```javascript
var stream = fs.createReadStream("my.csv");

csv
    .fromStream(stream, {ignoreEmpty: true})
    .on("record", function(data){
        console.log(data);
    })
    .on("end", function(){
        console.log("done");
    });

```

@@ -105,20 +163,20 @@ will be emitted with the row and the index.
```javascript
var stream = fs.createReadStream("my.csv");

csv
    .fromStream(stream, {headers : true})
    .validate(function(data){
        return data.age < 50; //all persons must be under the age of 50
    })
    .on("data-invalid", function(data){
        //do something with invalid row
    })
    .on("record", function(data){
        console.log(data);
    })
    .on("end", function(){
        console.log("done");
    });

```
@@ -130,18 +188,153 @@ be provided to validate and emitted as a row.
```javascript
var stream = fs.createReadStream("my.csv");

csv
    .fromStream(stream)
    .transform(function(data){
        return data.reverse(); //reverse each row.
    })
    .on("record", function(data){
        console.log(data);
    })
    .on("end", function(){
        console.log("done");
    });

```
### Formatting

`fast-csv` also allows you to create a `CSV` from data.

In addition to the options for parsing you can specify the following additional options.

* `quote='"'`: The character to use to escape values that contain a delimiter.
* `escape='"'`: The character to use when escaping a value that is `quoted` and contains a `quote` character.
  * `i.e`: 'First,"Name"' => '"First,""Name"""' (see the sketch at the end of this section)
**Writing Data**

Each of the following methods accepts an array of values to be written; the array must contain either `array`s or `object`s.

**`write(arr[, options])`**

Creates a readable stream that the `CSV` data can be read from.
```javascript
var ws = fs.createWriteStream("my.csv");

csv
    .write([
        ["a", "b"],
        ["a1", "b1"],
        ["a2", "b2"]
    ], {headers: true})
    .pipe(ws);
```
```javascript
var ws = fs.createWriteStream("my.csv");

csv
    .write([
        {a: "a1", b: "b1"},
        {a: "a2", b: "b2"}
    ], {headers: true})
    .pipe(ws);
```
**`writeToStream(stream, arr[, options])`**

Writes an array of values to the provided `WritableStream`.
```javascript
csv
    .writeToStream(fs.createWriteStream("my.csv"), [
        ["a", "b"],
        ["a1", "b1"],
        ["a2", "b2"]
    ], {headers: true});
```
```javascript
csv
    .writeToStream(fs.createWriteStream("my.csv"), [
        {a: "a1", b: "b1"},
        {a: "a2", b: "b2"}
    ], {headers: true});
```
**`writeToPath(path, arr[, options])`**

Writes an array of values to the specified path.
```javascript
csv
    .writeToPath("my.csv", [
        ["a", "b"],
        ["a1", "b1"],
        ["a2", "b2"]
    ], {headers: true})
    .on("finish", function(){
        console.log("done!");
    });
```
```javascript
csv
    .writeToPath("my.csv", [
        {a: "a1", b: "b1"},
        {a: "a2", b: "b2"}
    ], {headers: true})
    .on("finish", function(){
        console.log("done!");
    });
```
**`writeToString(arr[, options])`**
```javascript
csv.writeToString([
    ["a", "b"],
    ["a1", "b1"],
    ["a2", "b2"]
], {headers: true}); //"a,b\na1,b1\na2,b2\n"
```
```javascript
csv.writeToString([
    {a: "a1", b: "b1"},
    {a: "a2", b: "b2"}
], {headers: true}); //"a,b\na1,b1\na2,b2\n"
```
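
To round out the `quote` and `escape` options described at the top of this section, a minimal sketch using `writeToString`; the synchronous return value and the exact output are assumptions based on the escaping example and the `writeToString` examples above:

```javascript
var csv = require("fast-csv");

// A value containing the delimiter and embedded quotes is wrapped in the
// `quote` character and its embedded quotes are doubled with `escape`,
// i.e. First,"Name" => "First,""Name"""
// quote/escape are shown explicitly here; they are also the defaults.
var out = csv.writeToString([
    ["First,\"Name\"", "b1"]
], {quote: '"', escape: '"'});

console.log(out); // roughly: "First,""Name""",b1
```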
## Benchmarks
`Parsing 20000 records AVG over 3 runs`
```
fast-csv: 198.67ms
csv: 525.33ms
```
`Parsing 50000 records AVG over 3 runs`
```
fast-csv: 441.33ms
csv: 1291ms
```
`Parsing 100000 records AVG over 3 runs`
```
fast-csv: 866ms
csv: 2773.33ms
```
`Parsing 1000000 records AVG over 3 runs`
```
fast-csv: 8562.67ms
csv: 30030.67ms
```
## License
@@ -153,5 +346,19 @@ MIT <https://github.com/C2FO/fast-csv/raw/master/LICENSE>
* Website: <http://c2fo.com>
* Twitter: [http://twitter.com/c2fo](http://twitter.com/c2fo) - 877.465.4045

## Namespaces

## Classes
