Fetching Price Data, Tracking Algo State
=========================================

Blueshift callback functions are usually called with one or two arguments -
``context`` and ``data``. These are internally maintained objects that provide
useful functionality to the algo. The user strategy can use the ``context``
object to query the current state of the algo, including its profit and loss,
positions, leverage and other related information. The ``data`` object is used
for fetching price data. See below for more details.

.. note:: The ``context`` and ``data`` variables are maintained by the
    platform and are made available to most callback APIs automatically. You
    need not instantiate these objects on your own. Also, do not override
    their internal methods, else the algo will crash with an error.

Context Object
--------------

The Blueshift engine uses the internal ``context`` object to track the various
state metrics of the running strategy and keep them up-to-date. The
``context`` object is an instance of the internal class shown below. User
strategy code never instantiates this object. It is created and maintained by
the platform core engine and provides interfaces to the current context of the
running algorithm, including querying the current ``order`` and ``portfolio``
status. The ``context`` object is the first (and sometimes the only) argument
to all platform callback functions.

.. py:module:: blueshift.algorithm.context

.. autoclass:: AlgoContext()
    :noindex:

Context Attributes
+++++++++++++++++++

.. autoattribute:: AlgoContext.name
.. autoattribute:: AlgoContext.mode
.. autoattribute:: AlgoContext.execution_mode
.. autoattribute:: AlgoContext.trading_calendar
.. autoattribute:: AlgoContext.intraday_cutoff
.. autoattribute:: AlgoContext.record_vars
.. autoattribute:: AlgoContext.pnls
.. autoattribute:: AlgoContext.orders
.. autoattribute:: AlgoContext.open_orders
.. autoattribute:: AlgoContext.open_orders_by_asset

Portfolio and Account
++++++++++++++++++++++

The ``context`` object provides interfaces to the algo account and portfolio
through the attributes ``context.account`` and ``context.portfolio``,
accessible from the user strategy.

.. autoattribute:: AlgoContext.account
.. autoattribute:: AlgoContext.portfolio

The following example shows how to access account and positions data within
the strategy code.

.. code-block:: python

    def print_report(context):
        account = context.account
        portfolio = context.portfolio
        positions = portfolio.positions

        for asset in positions:
            position = positions[asset]
            print(f'position for {asset}:{position.quantity}')

        print(f'total portfolio {portfolio.portfolio_value}')
        print(f'exposure:{account.net_exposure}')

    def before_trading_start(context, data):
        print_report(context)

Data Object
-----------

The ``data`` object is the second argument to platform callback functions
(where applicable). It provides an interface for the user strategy to query
and fetch data.

.. py:module:: blueshift_data.readers.data_portal

Fetching Current Data
++++++++++++++++++++++

.. autoclass:: DataPortal
    :noindex:

.. automethod:: DataPortal.current

Querying Historical Data
+++++++++++++++++++++++++

.. autoclass:: DataPortal
    :noindex:

.. automethod:: DataPortal.history

.. versionchanged:: 2.1.0

    The ``frequency`` parameter now supports extended specifications in
    addition to ``1d`` and ``1m``. These can be specified in the pandas
    frequency format, e.g. ``5T``, ``30T`` or ``1H`` for 5-minute, 30-minute
    and 1-hour candles respectively.

    This is primarily designed for live trading; using it in a backtest can
    be very slow (the underlying data is stored only at minute or daily
    frequency, and other frequencies are resampled on-the-fly). The allowed
    values in live trading depend on broker support, i.e. the query will fail
    if the broker does not support the particular candle frequency. Also,
    this is available only for data queries with a single asset. For multiple
    assets, this will raise an error.
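For illustration, here is a minimal sketch of the extended frequency usage
described above. It assumes the standard ``handle_data`` minute callback and
an asset for which minute data (and, in live trading, broker support for
5-minute candles) is available.

.. code-block:: python

    from blueshift.api import symbol

    def initialize(context):
        context.asset = symbol("AAPL")

    def handle_data(context, data):
        # last 12 five-minute candles (roughly one hour) for a single asset;
        # extended frequencies work only for single-asset queries
        px = data.history(context.asset, 'close', 12, '5T')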
.. important::

    - These methods support the OHLCV ("open", "high", "low", "close" and
      "volume") columns for all assets (except options), for backtests as
      well as live trading. For non-tradable assets (e.g. a market index),
      "volume" may be zero or missing. For options, the open, high, low and
      volume fields are not available (see below for the extra fields).

    - For backtests, apart from the OHLCV columns, "open_interest" is also
      available for futures and options assets. This may not be available in
      live trading if the broker supports streaming data but does not include
      open interest in the stream.

    - For backtests, when querying options asset(s), greeks, implied
      volatility and the ATMF forward can be specified as field names as
      well. Use "implied_vol" for the implied volatility level and "atmf" for
      the prevailing at-the-money futures level. The supported greeks are
      "delta", "vega", "gamma" and "theta". The greek levels are computed
      on-the-fly using the Black 76 model and may not be stable near expiry.
      These fields may not be available in live trading (but the user
      strategy can :ref:`compute` them easily on-the-fly). See the sketch
      after the example below.

    - These methods handle rolling asset specifications by picking the
      correct asset at each timestamp for which data is returned. For
      example, querying ``symbol('ABC-ICE+100')`` for the last 20 minutes
      will return, for each minute, the call 100 points away from the
      at-the-money strike as applicable at that minute (although the
      underlying levels, and hence the actual strikes, may vary).

The following example shows how to access current and historical data and the
expected data types of the return values in various cases.

.. code-block:: python

    from blueshift.api import symbol

    def initialize(context):
        context.universe = [symbol("AAPL"), symbol("MSFT")]

    def before_trading_start(context, data):
        # this returns a float value
        px = data.current(context.universe[0], 'close')

        # this returns an int value
        px = data.current(context.universe[0], 'volume')

        # this returns a pandas.Series with the columns in the index
        px = data.current(context.universe[0], ['open','close'])

        # this returns a pandas.Series with the assets in the index
        px = data.current(context.universe, 'close')

        # this returns a pandas.DataFrame with assets in the index
        px = data.current(context.universe, ['open','close'])

        # px1 is a Series with timestamps as index
        px1 = data.history(context.universe[0], "close", 3, "1m")

        # px2 is a DataFrame with timestamp index and field names as columns
        px2 = data.history(context.universe[0], ['open','close'], 3, "1m")

        # px3 is a DataFrame with timestamp index and assets as columns
        px3 = data.history(context.universe, "close", 3, "1m")

        # px4 is a multi-index DataFrame with field names as columns and
        # asset as the second index level
        px4 = data.history(context.universe, ["open","close"], 3, "1m")

        # to fetch all fields for an asset, use `xs`
        # this returns a regular DataFrame with columns as field names
        asset_o_price = px4.xs(context.universe[0])

        # to fetch a field for all assets, use subsetting
        # this returns a regular DataFrame with columns as assets
        close_prices = px4['close']
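As a further illustration, here is a minimal sketch of querying the extra
option fields listed above in a backtest. The rolling specification
``'ABC-ICE+100'`` is the illustrative symbol from the note above and may not
match any instrument in your dataset; the field names are the ones documented
in the note.

.. code-block:: python

    from blueshift.api import symbol

    def initialize(context):
        # rolling option specification: the call 100 points away from
        # the at-the-money strike (illustrative symbol only)
        context.opt = symbol('ABC-ICE+100')

    def handle_data(context, data):
        # greeks, implied vol and the ATMF level as field names
        # (available in backtests; may be missing in live trading)
        snap = data.current(context.opt, ['delta', 'vega', 'implied_vol', 'atmf'])

        # last 20 minutes of delta for the rolling specification; the
        # underlying option is re-picked at each timestamp
        deltas = data.history(context.opt, 'delta', 20, '1m')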
Blueshift manages data in multiple layers. Actual raw data is stored
internally through a class named ``DataStore``. The core implementation of
the ``DataStore`` class uses |ref_link|. The ``DataStore`` class provides
low-level APIs to read and write data (from disk, from a streaming source
such as a websocket, or from an in-memory object). A high-level class
``Library`` handles (potentially multiple) ``DataStore`` instances with
defined dispatch functionality (to route a data query to the appropriate
``store`` among many). This ``Library`` class implements the ``DataPortal``
interface above. The algo simulation queries this ``Library`` instance to
fetch ``current`` data or data ``history``.

.. |ref_link| raw:: html

    <a href="https://arrow.apache.org/" target="_blank">Apache Arrow</a>
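To make the layering concrete, the following is a simplified, purely
conceptual sketch of the dispatch pattern described above. The class and
method names are hypothetical and do not reflect the actual ``DataStore`` or
``Library`` implementation.

.. code-block:: python

    # Conceptual sketch only: these names are hypothetical and are NOT the
    # actual Blueshift DataStore or Library classes.

    class MinuteStore:
        """Hypothetical low-level store serving minute bars."""
        def read(self, asset, field, nbars):
            raise NotImplementedError   # would read minute bars here

    class DailyStore:
        """Hypothetical low-level store serving daily bars."""
        def read(self, asset, field, nbars):
            raise NotImplementedError   # would read daily bars here

    class SimpleLibrary:
        """Routes each query to the appropriate store by frequency,
        mimicking the high-level dispatch role described above."""

        def __init__(self):
            self._stores = {'1m': MinuteStore(), '1d': DailyStore()}

        def history(self, asset, field, nbars, frequency):
            # dispatch: pick the store that holds data at this frequency
            return self._stores[frequency].read(asset, field, nbars)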