Asynchronously read multiple same-type data objects from an input stream with optional filtering
The algorithm assumes that, at its top level, the input stream consists exclusively of one or more serial objects of type TRoot.
There are two flavors of this template:
- CObjectIStreamAsyncIterator<TRoot> - iterates through the objects of type TRoot contained in the stream.
- CObjectIStreamAsyncIterator<TRoot, TChild> - iterates through the objects of type TChild contained within the objects of type TRoot.
Usage (reconstructed from the fragments above; the elided bodies are the caller's processing code):

    CObjectIStream istr ...;

    for (CObjectIStreamAsyncIterator<CSeq_entry> it(istr); it.IsValid(); ++it) {
        ...
    }

    for (CObjectIStreamAsyncIterator<CSeq_entry, CBioseq> it(istr); it.IsValid(); ++it) {
        ...
    }

or, with an STL-style algorithm, where it/eos act as the begin/end iterators over the stream:

    CObjectIStreamAsyncIterator<CSeq_entry> it(istr), eos;
    for_each(it, eos, [](CSeq_entry& obj) { ... });

    CObjectIStreamAsyncIterator<CSeq_entry, CBioseq> it(istr), eos;
    for_each(it, eos, [](CBioseq& obj) { ... });
bool CObjectIStreamAsyncIterator::IsValid(void) const

Check whether the iterator points to a data object. Returns TRUE if the iterator is constructed upon a serialization ...
To speed up reading, the iterator offloads data reading, pre-parsing, and parsing into separate threads. If the data stream contains numerous TRoot records, CObjectIStreamAsyncIterator can deliver a 2-4x wall-clock speed-up compared with synchronous processing of the same data (for example, with CObjectIStreamIterator).
The reader must read each object into memory in its entirety. If the objects are relatively small, several of them are read into a single buffer, which is efficient. A large object, however, still goes into a single buffer regardless of its size. To limit memory consumption, use the MaxTotalRawSize parameter.
The iterator does its job asynchronously: it starts working immediately upon construction and stops only when destroyed. Even if you never dereference it, it keeps reading and parsing data in the background.
Definition at line 336 of file streamiter.hpp.