Actual Stream constructor wrapped by the main exported function.
Adds a value to the end of a Stream.
Applies results from a Stream as arguments to a function.
Groups all values into an Array and passes down the stream as a single data event. This is a bit like doing toArray, but instead of accepting a callback and causing a thunk, it passes the value on.
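A minimal sketch of the difference, assuming Highland is required as _:

    var _ = require('highland');

    _([1, 2, 3]).collect().each(function (xs) {
        console.log(xs); // => [1, 2, 3], delivered as a single data event
    });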
Filters a Stream to drop all non-truthy values.
Concatenates a Stream to the end of this Stream.
Be aware that in the top-level export, the args may be in the reverse order to what you'd expect: _([a], [b]) => [b, a], as this follows the convention of other top-level exported functions which do x to y.
Consumes values from a Stream (once resumed) and returns a new Stream for you to optionally push values onto using the provided push / next functions.
This function forms the basis of many higher-level Stream operations. It will not cause a paused stream to immediately resume, but behaves more like a 'through' stream, handling values as they are read.
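For example, a sketch of a filter-like transform built on consume (the even-number predicate is just an illustration):

    var _ = require('highland');

    _([1, 2, 3, 4]).consume(function (err, x, push, next) {
        if (err) {
            push(err);          // forward errors unchanged
            next();
        }
        else if (x === _.nil) {
            push(null, _.nil);  // forward the end-of-stream marker
        }
        else {
            if (x % 2 === 0) {
                push(null, x);  // keep the value
            }
            next();             // ask for the next value
        }
    }).toArray(function (xs) {
        console.log(xs); // => [2, 4]
    });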
Holds off pushing data events downstream until there has been no more data for ms milliseconds. Sends the last value that occurred before the delay, discarding all other values.
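A sketch of the typical use; keypresses and search are hypothetical stand-ins for a bursty source and its handler:

    keypresses
        .debounce(500)
        .each(function (term) {
            search(term); // runs with the last value before each 500ms quiet period
        });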
Destroys a stream by unlinking it from any consumers and sources. This will stop all consumers from receiving events from this stream and removes this stream as a consumer of any source stream.
This function calls end() on the stream and unlinks it from any piped-to streams.
Iterates over every value from the Stream, calling the iterator function on each of them. This function causes a thunk.
If an error from the Stream reaches the each call, it will emit an error event (which will cause it to throw if unhandled).
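For instance:

    var _ = require('highland');

    _([1, 2, 3]).each(function (x) {
        console.log(x); // => 1, 2, 3
    });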
Ends a Stream. This is the same as sending a nil value as data. You shouldn't need to call this directly, rather it will be called by any Node Readable Streams you pipe in.
Extracts errors from a Stream and applies them to an error handler function. Returns a new Stream with the errors removed (unless the error handler chooses to rethrow them using push). Errors can also be transformed and put back onto the Stream as values.
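A sketch of both options, where stream stands for an existing Highland Stream and the statusCode check is hypothetical:

    stream.errors(function (err, push) {
        if (err.statusCode === 404) {
            push(null, 'not found'); // transform the error into a value
        }
        else {
            push(err); // rethrow errors we can't handle
        }
    });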
Creates a new Stream including only the values which pass a truth test.
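For example:

    var _ = require('highland');

    _([1, 2, 3, 4]).filter(function (x) {
        return x % 2 === 0;
    }).toArray(function (xs) {
        console.log(xs); // => [2, 4]
    });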
A convenient form of filter, which returns the first object from a Stream that passes the provided truth test.
Filters using a predicate which returns a Stream. If you need to check against an asynchronous data source when filtering a Stream, this can be convenient. The Stream returned from the filter function should have a Boolean as its first value (all other values on the Stream will be disregarded).
Creates a new Stream of values by applying each item in a Stream to an iterator function which may return a Stream. Each item on these result Streams are then emitted on a single output Stream.
The same as calling stream.map(f).flatten().
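A sketch using Highland's wrapCallback to lift fs.readFile into a Stream-returning function; the filenames are hypothetical:

    var _ = require('highland');
    var fs = require('fs');

    var readFile = _.wrapCallback(fs.readFile);

    _(['hello.txt', 'world.txt'])
        .flatMap(readFile)
        .each(function (contents) {
            console.log(contents.toString());
        });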
Recursively reads values from a Stream which may contain nested Streams or Arrays. As values or errors are encountered, they are emitted on a single output Stream.
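For example:

    var _ = require('highland');

    _([1, [2, [3, 4]], 5]).flatten().toArray(function (xs) {
        console.log(xs); // => [1, 2, 3, 4, 5]
    });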
Forks a stream, allowing you to add additional consumers with shared back-pressure. A stream forked to multiple consumers will only pull values from its source as fast as the slowest consumer can handle them.
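A sketch with two forks; both must be consumed for values to flow, and the source is only pulled as fast as the slower of the two:

    var _ = require('highland');

    var xs = _([1, 2, 3, 4]);
    var evens = xs.fork().filter(function (x) { return x % 2 === 0; });
    var tens = xs.fork().map(function (x) { return x * 10; });

    evens.toArray(function (ys) { console.log(ys); }); // => [2, 4]
    tens.toArray(function (ys) { console.log(ys); });  // => [10, 20, 30, 40]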
A convenient form of reduce, which groups items based on a function or property name. toString() is called on the result of the function.
Creates a new Stream with only the first value from the source.
Calls a named method on each object from the Stream, returning a new Stream with the result of those calls.
Drops all values from the Stream apart from the last one (if any).
Creates a new Stream, which when read from, only returns the last seen value from the source. The source stream does not experience back-pressure. Useful if you're using a Stream to model a changing property which you need to query periodically.
Creates a new Stream of transformed values by applying a function to each value from the source. The transformation function can be replaced with a non-function value for convenience, and it will emit that value for every data event on the source Stream.
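A sketch of both forms:

    var _ = require('highland');

    _([1, 2, 3]).map(function (x) {
        return x * 2;
    }).toArray(function (xs) {
        console.log(xs); // => [2, 4, 6]
    });

    // The non-function convenience form emits the value per data event:
    _([1, 2, 3]).map('hi').toArray(function (xs) {
        console.log(xs); // => ['hi', 'hi', 'hi']
    });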
Takes a Stream of Streams and merges their values and errors into a single new Stream. The merged stream ends when all source streams have ended.
Note that no guarantee is made with respect to the order in which values for each stream end up in the merged stream. Values in the merged stream will, however, respect the order they were emitted from their respective streams.
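A minimal sketch:

    var _ = require('highland');

    var a = _([1, 2]);
    var b = _([3, 4]);

    _([a, b]).merge().toArray(function (xs) {
        console.log(xs); // e.g. [1, 3, 2, 4]; interleaving order is not guaranteed
    });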
Observes a stream, allowing you to handle values as they are emitted, without adding back-pressure or causing data to be pulled from the source. This can be useful when you are performing two related queries on a stream where one would block the other. Just be aware that a slow observer could fill up its buffer and cause memory issues. Where possible, you should use fork.
Switches source to an alternate Stream if the current Stream is empty.
Takes a Stream of Streams and reads from them in parallel, buffering the results until they can be returned to the consumer in their original order.
Pauses the stream. All Highland Streams start in the paused state.
Pipes a Highland Stream to a Node Writable Stream (Highland Streams are also Node Writable Streams). This will pull all the data from the source Highland Stream and write it to the destination, automatically managing flow so that the destination is not overwhelmed by a fast source.
This function returns the destination so you can chain together pipe calls.
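A sketch writing to a hypothetical out.txt:

    var _ = require('highland');
    var fs = require('fs');

    _(['hello', ' ', 'world'])
        .pipe(fs.createWriteStream('out.txt'));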
Retrieves values associated with a given property from all elements in the collection.
Consumes a single item from the Stream. Unlike consume, this function will not provide a new stream for you to push values onto, and it will unsubscribe as soon as it has a single error, value or nil from the source.
You probably won't need to use this directly, but it is used internally by some functions in the Highland library.
Boils down a Stream to a single value. The memo is the initial state of the reduction, and each successive step of it should be returned by the iterator function. The iterator is passed two arguments: the memo and the next value.
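For example, summing a Stream (using the memo-first argument order described above):

    var _ = require('highland');

    function add(memo, x) {
        return memo + x;
    }

    _([1, 2, 3, 4]).reduce(0, add).each(function (total) {
        console.log(total); // => 10
    });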
Same as reduce, but uses the first element as the initial state instead of passing in a memo value.
The inverse of filter.
Resumes a paused Stream. This will either read from the Stream's incoming buffer or request more data from an upstream source.
Like reduce, but emits each intermediate value of the reduction as it is calculated.
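Using the add function from the reduce example above, scan emits the running totals:

    _([1, 2, 3, 4]).scan(0, add).toArray(function (xs) {
        console.log(xs); // => [0, 1, 3, 6, 10]
    });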
Same as scan, but uses the first element as the initial state instead of passing in a memo value.
Reads values from a Stream of Streams, emitting them on a single output Stream. This can be thought of as a flatten, just one level deep. Often used for resolving asynchronous actions such as an HTTP request or reading a file.
An alias for the sequence method.
Like the errors method, but emits a Stream end marker after an Error is encountered.
Creates a new Stream with the first n values from the source.
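For example:

    var _ = require('highland');

    _([1, 2, 3, 4]).take(2).toArray(function (xs) {
        console.log(xs); // => [1, 2]
    });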
Ensures that only one data event is pushed downstream (or into the buffer) every ms milliseconds; any other values are dropped.
Collects all values from a Stream into an Array and calls a function once with the result. This function causes a thunk.
If an error from the Stream reaches the toArray call, it will emit an error event (which will cause it to throw if unhandled).
A convenient form of filter, which returns all objects from a Stream that match a set of property values.
Writes a value to the Stream. If the Stream is paused it will go into the Stream's incoming buffer, otherwise it will be immediately processed and sent to the Stream's consumers (if any). Returns false if the Stream is paused, true otherwise. This lets Node's pipe method handle back-pressure.
You shouldn't need to call this yourself, but it may be called by Node functions which treat Highland Streams as a Node Writable Stream.
Takes two Streams and returns a Stream of corresponding pairs.
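For example:

    var _ = require('highland');

    _([1, 2, 3]).zip(_(['a', 'b', 'c'])).toArray(function (xs) {
        console.log(xs); // => [[1, 'a'], [2, 'b'], [3, 'c']]
    });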