Fetch new pricing datasets for one or many tickers at once, or pull screeners from IEX Cloud (https://iexcloud.io), Tradier (https://tradier.com/), and FinViz (https://finviz.com/)
Examples
Fetch Intraday Minute Pricing Data
Fetch Intraday Option Chains for Calls and Puts
Fetch Intraday News, Minute and Options
fetch -t QQQ -g news,min,td
Debugging
Turn on verbose debugging with the -d argument:
fetch -t TICKER -d
analysis_engine.scripts.fetch_new_stock_datasets.fetch_new_stock_datasets()
Collect datasets for a ticker from IEX Cloud or Tradier
Setup
export IEX_TOKEN=YOUR_IEX_CLOUD_TOKEN
export TD_TOKEN=YOUR_TRADIER_TOKEN
Pull Data for a Ticker from IEX and Tradier
fetch -t TICKER
Pull from All Supported IEX Feeds
fetch -t TICKER -g iex-all
Pull from All Supported Tradier Feeds
fetch -t TICKER -g td
Intraday IEX and Tradier Feeds (only minute and news to reduce costs)
fetch -t TICKER -g intra
# or manually:
# fetch -t TICKER -g td,iex_min,iex_news
Daily IEX Feeds (daily and news)
fetch -t TICKER -g daily
# or manually:
# fetch -t TICKER -g iex_day,iex_news
Weekly IEX Feeds (company, financials, earnings, dividends, and peers)
fetch -t TICKER -g weekly
# or manually:
# fetch -t TICKER -g iex_fin,iex_earn,iex_div,iex_peers,iex_news,iex_comp
IEX Minute
fetch -t TICKER -g iex_min
IEX News
fetch -t TICKER -g iex_news
IEX Daily
fetch -t TICKER -g iex_day
IEX Stats
fetch -t TICKER -g iex_stats
IEX Peers
fetch -t TICKER -g iex_peers
IEX Financials
fetch -t TICKER -g iex_fin
IEX Earnings
fetch -t TICKER -g iex_earn
IEX Dividends
fetch -t TICKER -g iex_div
IEX Quote
fetch -t TICKER -g iex_quote
IEX Company
fetch -t TICKER -g iex_comp
Note
This requires the following services to be listening on:
localhost:6379 (Redis)
localhost:9000 (S3 endpoint, e.g. Minio)
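Before running a fetch it can help to confirm both backing services are actually reachable. Here is a minimal stdlib-only sketch; the `service_listening` helper is illustrative and not part of the analysis_engine API:

```python
import socket


def service_listening(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP service accepts connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


# Check the backing services the Note above requires:
# Redis (pricing cache) on 6379 and the S3 endpoint on 9000
for name, port in (('redis', 6379), ('s3', 9000)):
    status = 'up' if service_listening('localhost', port) else 'DOWN'
    print(f'{name} localhost:{port} -> {status}')
```

If either line reports DOWN, start the service before running fetch.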
A tool for showing how to build an algorithm and run a backtest with an algorithm config dictionary
import analysis_engine.consts as ae_consts
import analysis_engine.algo as base_algo
import analysis_engine.run_algo as run_algo

ticker = 'SPY'
timeseries = 'minute'  # dataset granularity for the backtest

willr_close_path = (
    'analysis_engine/mocks/example_indicator_williamsr.py')
willr_open_path = (
    'analysis_engine/mocks/example_indicator_williamsr_open.py')

algo_config_dict = {
    'name': 'min-runner',
    'timeseries': timeseries,
    'trade_horizon': 5,
    'num_owned': 10,
    'buy_shares': 10,
    'balance': 10000.0,
    'commission': 6.0,
    'ticker': ticker,
    'algo_module_path': None,
    'algo_version': 1,
    'verbose': False,  # log in the algorithm
    'verbose_processor': False,  # log in the indicator processor
    'verbose_indicators': False,  # log all indicators
    'verbose_trading': True,  # log in the algo trading methods
    'positions': {
        ticker: {
            'shares': 10,
            'buys': [],
            'sells': []
        }
    },
    'buy_rules': {
        'confidence': 75,
        'min_indicators': 3
    },
    'sell_rules': {
        'confidence': 75,
        'min_indicators': 3
    },
    'indicators': [
        {
            'name': 'willr_-70_-30',
            'module_path': willr_close_path,
            'category': 'technical',
            'type': 'momentum',
            'uses_data': 'minute',
            'high': 0,
            'low': 0,
            'close': 0,
            'open': 0,
            'willr_value': 0,
            'num_points': 80,
            'buy_below': -70,
            'sell_above': -30,
            'is_buy': False,
            'is_sell': False,
            'verbose': False  # log in just this indicator
        },
        {
            'name': 'willr_-80_-20',
            'module_path': willr_close_path,
            'category': 'technical',
            'type': 'momentum',
            'uses_data': 'minute',
            'high': 0,
            'low': 0,
            'close': 0,
            'open': 0,
            'willr_value': 0,
            'num_points': 30,
            'buy_below': -80,
            'sell_above': -20,
            'is_buy': False,
            'is_sell': False
        },
        {
            'name': 'willr_-90_-10',
            'module_path': willr_close_path,
            'category': 'technical',
            'type': 'momentum',
            'uses_data': 'minute',
            'high': 0,
            'low': 0,
            'close': 0,
            'open': 0,
            'willr_value': 0,
            'num_points': 60,
            'buy_below': -90,
            'sell_above': -10,
            'is_buy': False,
            'is_sell': False
        },
        {
            'name': 'willr_open_-80_-20',
            'module_path': willr_open_path,
            'category': 'technical',
            'type': 'momentum',
            'uses_data': 'minute',
            'high': 0,
            'low': 0,
            'close': 0,
            'open': 0,
            'willr_open_value': 0,
            'num_points': 80,
            'buy_below': -80,
            'sell_above': -20,
            'is_buy': False,
            'is_sell': False
        }
    ],
    'slack': {
        'webhook': None
    }
}


class ExampleCustomAlgo(base_algo.BaseAlgo):
    def process(self, algo_id, ticker, dataset):
        if self.verbose:
            print(
                f'process start - {self.name} '
                f'date={self.backtest_date} minute={self.latest_min} '
                f'close={self.latest_close} high={self.latest_high} '
                f'low={self.latest_low} open={self.latest_open} '
                f'volume={self.latest_volume}')
    # end of process
# end of ExampleCustomAlgo


algo_obj = ExampleCustomAlgo(
    ticker=algo_config_dict['ticker'],
    config_dict=algo_config_dict)

algo_res = run_algo.run_algo(
    ticker=algo_config_dict['ticker'],
    algo=algo_obj,
    raise_on_err=True)

if algo_res['status'] != ae_consts.SUCCESS:
    print(
        'failed running algo backtest '
        f'{algo_obj.get_name()} hit status: '
        f'{ae_consts.get_status(status=algo_res["status"])} '
        f'error: {algo_res["err"]}')
else:
    print(
        f'backtest: {algo_obj.get_name()} '
        f'{ae_consts.get_status(status=algo_res["status"])} - '
        'plotting history')
# if not successful
analysis_engine.scripts.run_backtest_and_plot_history.build_example_algo_config(ticker, timeseries='minute')
Helper for building an algorithm config dictionary
Returns: algorithm config dictionary
analysis_engine.scripts.run_backtest_and_plot_history.ExampleCustomAlgo(ticker=None, balance=5000.0, commission=6.0, tickers=None, name=None, use_key=None, auto_fill=True, version=1, config_file=None, config_dict=None, output_dir=None, publish_to_slack=False, publish_to_s3=False, publish_to_redis=False, publish_input=True, publish_history=True, publish_report=True, load_from_s3_bucket=None, load_from_s3_key=None, load_from_redis_key=None, load_from_file=None, load_compress=False, load_publish=True, load_config=None, report_redis_key=None, report_s3_bucket=None, report_s3_key=None, report_file=None, report_compress=False, report_publish=True, report_config=None, history_redis_key=None, history_s3_bucket=None, history_s3_key=None, history_file=None, history_compress=False, history_publish=True, history_config=None, extract_redis_key=None, extract_s3_bucket=None, extract_s3_key=None, extract_file=None, extract_save_dir=None, extract_compress=False, extract_publish=True, extract_config=None, dataset_type=20000, serialize_datasets=['daily', 'minute', 'quote', 'stats', 'peers', 'news1', 'financials', 'earnings', 'dividends', 'company', 'news', 'calls', 'puts', 'pricing', 'tdcalls', 'tdputs'], timeseries=None, trade_strategy=None, verbose=False, verbose_processor=False, verbose_indicators=False, verbose_trading=False, verbose_load=False, verbose_extract=False, verbose_history=False, verbose_report=False, inspect_datasets=False, raise_on_err=True, **kwargs)
process(algo_id, ticker, dataset)
Run a custom algorithm after all the indicators from the algo_config_dict
have been processed and all the number crunching is done. This allows the algorithm class to focus on the high-level trade execution problems like bid-ask spreads and opening the buy/sell trade orders.
How does it work?
The engine provides a data stream from the latest pricing updates stored in Redis. Once new data lands in Redis, algorithms can use each dataset as a chance to evaluate buy and sell decisions. This is where your own custom trading logic lives, based on what the indicators find and on any non-indicator data provided within the dataset dictionary.
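The buy_rules and sell_rules entries in the algorithm config (confidence and min_indicators, as shown in the example config above) express a simple voting scheme over the indicators. A minimal sketch of that idea, assuming the engine tallies fired indicators against both thresholds; the `should_trade` helper is illustrative and not part of the engine's API:

```python
def should_trade(indicator_flags, rules):
    """Decide whether enough indicators fired with enough confidence.

    indicator_flags: list of (fired: bool, confidence: 0-100) per indicator
    rules: dict like {'confidence': 75, 'min_indicators': 3}
    """
    fired = [conf for hit, conf in indicator_flags if hit]
    # not enough indicators agree - no trade
    if len(fired) < rules['min_indicators']:
        return False
    # require the average confidence of the fired indicators
    # to clear the configured threshold
    avg_confidence = sum(fired) / len(fired)
    return avg_confidence >= rules['confidence']


buy_rules = {'confidence': 75, 'min_indicators': 3}
signals = [(True, 80), (True, 90), (True, 70), (False, 0)]
print(should_trade(signals, buy_rules))  # True: three fired, avg 80 >= 75
```

The same shape applies to sell_rules with the indicators' is_sell flags.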
Dataset Dictionary Structure
Here is what the dataset variable looks like when your algorithm's process method is called (assuming you have Redis running with actual pricing data too):
dataset = {
    'id': dataset_id,
    'date': date,
    'data': {
        'daily': pd.DataFrame([]),
        'minute': pd.DataFrame([]),
        'quote': pd.DataFrame([]),
        'stats': pd.DataFrame([]),
        'peers': pd.DataFrame([]),
        'news1': pd.DataFrame([]),
        'financials': pd.DataFrame([]),
        'earnings': pd.DataFrame([]),
        'dividends': pd.DataFrame([]),
        'calls': pd.DataFrame([]),
        'puts': pd.DataFrame([]),
        'pricing': pd.DataFrame([]),
        'news': pd.DataFrame([])
    }
}
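Inside process you typically reach into dataset['data'] and read one of these DataFrames. A small sketch of pulling the most recent minute-bar close, assuming the structure above; the `latest_minute_close` helper and the sample values are illustrative, not part of the engine:

```python
import pandas as pd


def latest_minute_close(dataset):
    """Return the most recent minute-bar close from a dataset dict,
    or None when the minute DataFrame is missing or empty."""
    df = dataset.get('data', {}).get('minute')
    if df is None or df.empty or 'close' not in df.columns:
        return None
    return df['close'].iloc[-1]


# hypothetical dataset shaped like the dictionary above
example = {
    'id': 'SPY_2019-01-02',
    'date': '2019-01-02',
    'data': {
        'minute': pd.DataFrame([
            {'close': 249.71, 'volume': 1200},
            {'close': 249.92, 'volume': 900},
        ]),
    },
}
print(latest_minute_close(example))  # 249.92
```

Guarding for empty DataFrames matters because feeds can be missing for a given backtest date.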
Tip
you can also inspect these datasets by setting the algorithm's config dictionary key "inspect_datasets": True to inspect the DataFrame objects.
analysis_engine.scripts.run_backtest_and_plot_history.run_backtest_and_plot_history(config_dict)
Run a derived algorithm with an algorithm config dictionary
Parameters: config_dict – algorithm config dictionary
A tool for plotting an algorithm's Trading History from a locally saved file from running the backtester with the save to file option enabled:
run_backtest_and_plot_history.py -t SPY -f <SAVE_HISTORY_TO_THIS_FILE>
analysis_engine.scripts.plot_history_from_local_file.plot_local_history_file()
Run a derived algorithm with an algorithm config dictionary
Parameters: config_dict – algorithm config dictionary
Publish stock data in an s3 key to redis
Publish the contents of an S3 key to a Redis key
Steps:
analysis_engine.scripts.publish_from_s3_to_redis.publish_from_s3_to_redis()
Download an S3 key and publish its contents to Redis
Publish stock data in an s3 key to redis
Publish the aggregated S3 contents of a ticker to a Redis key and back to S3
Steps:
analysis_engine.scripts.publish_ticker_aggregate_from_s3.publish_ticker_aggregate_from_s3()
Download all ticker data from S3 and publish its contents to Redis and back to S3
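Conceptually, the aggregate step merges each per-dataset payload for a ticker into one compiled blob before publishing it. A rough stdlib-only sketch of that merge-and-compress shape, assuming JSON-serializable payloads; the function name and layout are illustrative, not the script's actual implementation:

```python
import json
import zlib


def aggregate_ticker_datasets(datasets: dict) -> bytes:
    """Merge per-dataset payloads into one compressed blob,
    mimicking the aggregate -> compress -> publish flow."""
    # drop empty feeds so the compiled blob only carries real data
    compiled = {name: payload for name, payload in datasets.items() if payload}
    raw = json.dumps(compiled, sort_keys=True).encode('utf-8')
    return zlib.compress(raw)


# hypothetical per-key payloads pulled from S3
per_key_data = {
    'daily': [{'close': 280.1}],
    'minute': [{'close': 280.4}],
    'news': [],  # empty feeds are dropped from the compiled blob
}
blob = aggregate_ticker_datasets(per_key_data)
restored = json.loads(zlib.decompress(blob))
print(sorted(restored))  # ['daily', 'minute']
```

The compressed blob is what would then be written to the compiled S3 bucket and the Redis key.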
Set these as needed for your S3 deployment
export ENABLED_S3_UPLOAD=<'0' disabled which is the default, '1' enabled>
export S3_ACCESS_KEY=<access key>
export S3_SECRET_KEY=<secret key>
export S3_REGION_NAME=<region name: us-east-1>
export S3_ADDRESS=<S3 endpoint address host:port like: localhost:9000>
export S3_UPLOAD_FILE=<path to file to upload>
export S3_BUCKET=<bucket name - pricing default>
export S3_COMPILED_BUCKET=<compiled bucket name - compileddatasets default>
export S3_KEY=<key name - SPY_demo default>
export S3_SECURE=<use ssl '1', disable with '0' which is the default>
export PREPARE_S3_BUCKET_NAME=<prepared dataset bucket name>
export ANALYZE_S3_BUCKET_NAME=<analyzed dataset bucket name>
Set Redis Environment Variables
Set these as needed for your Redis deployment
export ENABLED_REDIS_PUBLISH=<'0' disabled which is the default, '1' enabled>
export REDIS_ADDRESS=<redis endpoint address host:port like: localhost:6379>
export REDIS_KEY=<key to cache values in redis>
export REDIS_PASSWORD=<optional - redis password>
export REDIS_DB=<optional - redis database - 0 by default>
export REDIS_EXPIRE=<optional - redis expiration for data in seconds>
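Since REDIS_ADDRESS packs host and port into one host:port value, a consumer has to split it before opening a connection. A minimal sketch of reading these variables with the documented defaults (localhost:6379, db 0); the `redis_settings` helper is illustrative, not part of the engine:

```python
import os


def redis_settings():
    """Read the Redis environment variables described above,
    falling back to the documented defaults."""
    address = os.environ.get('REDIS_ADDRESS', 'localhost:6379')
    host, _, port = address.partition(':')
    return {
        'host': host,
        'port': int(port or 6379),
        'db': int(os.environ.get('REDIS_DB', '0')),
        'password': os.environ.get('REDIS_PASSWORD'),  # optional
        'expire': os.environ.get('REDIS_EXPIRE'),      # seconds, optional
    }


os.environ['REDIS_ADDRESS'] = 'localhost:6379'
print(redis_settings()['port'])  # 6379
```

The resulting dictionary maps directly onto the connection parameters most Redis clients accept.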