Building an algorithmic trading bot often feels like taking a stab in the dark. You find a strategy, spend time and money building it, then let it run and hope like crazy that it actually works.
Hopefully, you don’t lose too much money in the process.
Hopefully, you’ve got everything right.
Hopefully, it won’t take too long to figure out.
It doesn’t have to be that way.
Over the next few chapters, I’ll show you how to take your trading bot to the next level. Using practical development approaches such as strategy backtesting, trade analysis, and automated reporting, you’ll learn to pinpoint exactly where your algorithmic trading bot can be improved.
The start of this process is building a way to record trading bot activity. In this chapter, I’ll show you how to use an open-source and widely used database system called PostgreSQL (also known as Postgres) to do exactly this.
P.S. I’ve had a few people reach out lately to ask if I’d be able to help them build their own trading bot. Use this link if this is you.
The book Build Your Own Algorithmic Trading Bot with Python covers everything you need to know to build your own trading bot. It includes incredible content for detecting indicators, developing strategies, and continuing to evolve your trading — all supported by an open-source GitHub repository. Check out the introduction to see a list of published and upcoming content.
Not financial advice. This article and the code provided are for use at your own risk. It isn’t financial advice, nor is it designed to make claims about profitability.
By the end of this chapter, you’ll have created the ability to store the trading activity of your trading bot in a PostgreSQL database.
This chapter assumes the following:
Implementing a database system into your trading bot is a powerful way to level up your algorithmic trading bot. Doing so allows you to:
This chapter prepares your trading bot to undertake all these actions and more.
A Common Information Model (CIM) is a development technique to ensure data format consistency. Doing so early and often ensures that your trading bot:
For this book, a constantly updated list of CIM entries can be found on the GitHub project here.
While there are many excellent database solutions, I’ll be using PostgreSQL (aka Postgres) to demonstrate. Whatever solution you choose, it should meet a few key requirements:
Postgres fulfills these requirements.
PostgreSQL bills itself as “The World’s Most Advanced Open Source Relational Database”. It is a well-used, highly performant database that can easily store millions of rows of data using Structured Query Language (SQL) syntax.
Postgres itself has been around for many years. It is used by many companies large and small and supports an open-source, free-to-use model. Should your trading bot reach a more professional level, it is easy to move this to a paid version.
I’ll start by ensuring that Postgres is connected to your trading bot.
To get set up:

1. Create a database called `trading_bot_db`.
2. Add a `postgres` object to your `settings.json` as follows:

```json
"postgres": {
    "host": "your_hostname",
    "database": "your_database",
    "user": "your_username",
    "password": "your_secret_password",
    "port": "port_your_db_listens_on"
}
```

`example_settings.json` in the project GitHub demonstrates what this looks like.

3. Create a folder called `sql_lib` in your trading bot.
4. Inside `sql_lib`, create a file called `sql_interaction.py`.
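The `main.py` examples later in this chapter call a `get_project_settings` helper to load this file. The book’s repository has its own implementation; if you’re following along standalone, a minimal sketch (assuming the `import_filepath` parameter name used later in this chapter) might look like this:

```python
import json


def get_project_settings(import_filepath):
    """Load project settings (including the postgres object) from a JSON file."""
    with open(import_filepath, "r") as settings_file:
        # Parse the whole settings file into a dictionary
        return json.load(settings_file)
```

The returned dictionary is what the connection function below indexes into via `project_settings['postgres']`.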
Import the following packages to your project:

- `psycopg2` (link to a useful how-to guide)
- `psycopg2.extras`

Create a connection function in `sql_interaction.py`. Note the use of the `try-except` statement:
```python
def postgres_connect(project_settings):
    """
    Function to connect to PostgreSQL database
    :param project_settings: JSON object
    :return: connection object
    """
    # Define the connection
    try:
        conn = psycopg2.connect(
            database=project_settings['postgres']['database'],
            user=project_settings['postgres']['user'],
            password=project_settings['postgres']['password'],
            host=project_settings['postgres']['host'],
            port=project_settings['postgres']['port']
        )
        return conn
    except Exception as e:
        print(f"Error connecting to Postgres: {e}")
        return False
```
Before going to the next step, check that the connection works. To do this, update `main.py` so that your `__main__` looks like this:
```python
# Press the green button in the gutter to run the script.
if __name__ == '__main__':
    # Import project settings
    project_settings = get_project_settings(import_filepath=import_filepath)
    # Connect to Postgres
    sql_connect = sql_interaction.postgres_connect(project_settings=project_settings)
    print(sql_connect)
```
If all goes well, you should get a line printed to your terminal similar to this:
<connection object at 0x000001E06A151140; dsn: 'user=xxxx password=xxx dbname=trading_bot_db host=xxxx port=xxxx', closed: 0>
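As an extra sanity check, you can run a trivial query over the new connection. This helper isn’t part of the book’s repository; it’s a small sketch of the cursor/execute/fetch pattern used throughout the rest of this chapter:

```python
def postgres_version(conn):
    """Run a trivial query to confirm the connection is usable, returning the server version string."""
    cursor = conn.cursor()
    cursor.execute("SELECT version()")
    # fetchone() returns a one-element tuple for this query
    return cursor.fetchone()[0]
```

Calling `print(postgres_version(sql_connect))` should show something like `PostgreSQL 15.x ...`.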
As a relational database system, Postgres uses a series of tables to store rows of data. These tables are stored in a database.
To store/retrieve data from a Postgres database, you need to define the:
In the previous section, you created a connection to a Postgres Database. Let’s expand that functionality to:
Interacting with Postgres is performed through the `execute` method of a connection’s cursor. Create a function called `sql_execute` in `sql_interaction`:
```python
def sql_execute(sql_query, project_settings):
    """
    Function to execute SQL statements
    :param sql_query: String
    :param project_settings: JSON object
    :return: Boolean
    """
    # Create a connection
    conn = postgres_connect(project_settings=project_settings)
    # Execute the query
    try:
        # Create the cursor
        cursor = conn.cursor()
        # Execute the cursor query
        cursor.execute(sql_query)
        # Commit the changes
        conn.commit()
        return True
    except (Exception, psycopg2.Error) as e:
        print(f"Failed to execute query: {e}")
        return False
    finally:
        # If a connection was opened, close it
        if conn:
            conn.close()
```
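`sql_execute` covers writes, but later analysis work will also need to read rows back. The book’s repository may implement this differently; here’s a sketch of a companion read helper. The optional `conn` parameter is my own addition (not from the book) so the function can be exercised without a live database; when it is omitted, the function relies on the `postgres_connect` function defined earlier:

```python
def sql_select(sql_query, project_settings, conn=None):
    """
    Run a read-only query and return all matching rows.
    :param sql_query: String
    :param project_settings: JSON object
    :param conn: optional existing connection (opens one via postgres_connect if omitted)
    :return: List of row tuples (empty list on failure)
    """
    owns_conn = conn is None
    if owns_conn:
        conn = postgres_connect(project_settings=project_settings)
    try:
        cursor = conn.cursor()
        cursor.execute(sql_query)
        # fetchall() returns every matching row as a tuple
        return cursor.fetchall()
    except Exception as e:
        print(f"Failed to run query: {e}")
        return []
    finally:
        # Only close connections this function opened itself
        if owns_conn and conn:
            conn.close()
```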
Create a function in `sql_interaction` for table creation, assuming the following:
Here’s the code:
```python
# Function to create a table
def create_sql_table(table_name, table_details, project_settings):
    """
    Function to create a table in SQL
    :param table_name: String
    :param table_details: String
    :param project_settings: JSON Object
    :return: Boolean
    """
    # Create the query string
    sql_query = f"CREATE TABLE {table_name} (id SERIAL PRIMARY KEY, {table_details})"
    # Execute the query
    return sql_execute(sql_query=sql_query, project_settings=project_settings)
```
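To make the string assembly concrete, here’s the query this function produces for a toy table (the table and column names below are illustrative, not from the book). Note how the auto-incrementing `id SERIAL PRIMARY KEY` column is prepended to whatever columns you pass in:

```python
def build_create_table_query(table_name, table_details):
    # Mirrors the query string that create_sql_table assembles
    return f"CREATE TABLE {table_name} (id SERIAL PRIMARY KEY, {table_details})"


query = build_create_table_query("demo_table", "symbol VARCHAR(50) NOT NULL")
# query is now:
# CREATE TABLE demo_table (id SERIAL PRIMARY KEY, symbol VARCHAR(50) NOT NULL)
```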
A Postgres table uses columns to define the data that each row will contain. Functionally, this is very similar to an Excel spreadsheet — each column has a heading that defines what exists in the rows under it. The only difference is that in Postgres you must also define the exact data type to be entered.
To define the columns to use for your Python Trading Bot, I’ll use the CIM for `live_trade_table` and `paper_trade_table`:
```json
{
    "live_trade_table": {
        "strategy": "String defining strategy",
        "exchange": "String defining the exchange being used",
        "trade_type": "String of the type of trade: BUY / SELL / BUY_STOP / SELL_STOP",
        "trade_stage": "Stage of trade: order / position",
        "symbol": "String of the symbol",
        "volume": "Float of the volume",
        "stop_loss": "Float of the stop loss value",
        "take_profit": "Float of the take profit value",
        "comment": "String of the comment",
        "status": "String of the status: CANCELLED / PLACED",
        "price": "Float of the executed price",
        "order_id": "String of a unique identifier for the order"
    },
    "paper_trade_table": {
        "strategy": "String defining strategy",
        "exchange": "String defining the exchange being used",
        "trade_type": "String of the type of trade: BUY / SELL / BUY_STOP / SELL_STOP",
        "trade_stage": "Stage of trade: order / position",
        "symbol": "String of the symbol",
        "volume": "Float of the volume",
        "stop_loss": "Float of the stop loss value",
        "take_profit": "Float of the take profit value",
        "comment": "String of the comment",
        "status": "String of the status: CANCELLED / PLACED",
        "price": "Float of the executed price",
        "order_id": "String of a unique identifier for the order"
    }
}
```
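One practical use of the CIM is validating a trade dictionary before it goes anywhere near the database. This helper isn’t part of the book’s repository; it’s a small sketch assuming the column names listed above:

```python
# Columns every trade table shares, taken from the CIM above
TRADE_TABLE_COLUMNS = [
    "strategy", "exchange", "trade_type", "trade_stage", "symbol", "volume",
    "stop_loss", "take_profit", "comment", "status", "price", "order_id",
]


def missing_trade_columns(trade_information):
    """Return the CIM columns absent from a trade_information dictionary."""
    return [column for column in TRADE_TABLE_COLUMNS if column not in trade_information]
```

Calling this before any insert (and only proceeding when it returns an empty list) catches malformed trade dictionaries early, rather than as a failed SQL statement.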
The CIM shows that all trade tables have a common set of columns. Therefore, a single function called `create_trade_table` can create both. Add this function to `sql_interaction`:
```python
def create_trade_table(table_name, project_settings):
    """
    Function to create a trade table in SQL
    :param table_name: String
    :param project_settings: JSON Object
    :return: Boolean
    """
    # Define the table according to the CIM: https://github.com/jimtin/python_trading_bot/blob/master/common_information_model.json
    table_details = "strategy VARCHAR(100) NOT NULL," \
                    "exchange VARCHAR(100) NOT NULL," \
                    "trade_type VARCHAR(50) NOT NULL," \
                    "trade_stage VARCHAR(50) NOT NULL," \
                    "symbol VARCHAR(50) NOT NULL," \
                    "volume FLOAT4 NOT NULL," \
                    "stop_loss FLOAT4 NOT NULL," \
                    "take_profit FLOAT4 NOT NULL," \
                    "price FLOAT4 NOT NULL," \
                    "comment VARCHAR(250) NOT NULL," \
                    "status VARCHAR(100) NOT NULL," \
                    "order_id VARCHAR(100) NOT NULL"
    # Pass to the create_sql_table function
    return create_sql_table(table_name=table_name, table_details=table_details, project_settings=project_settings)
```
Now, you can update your `main.py` to create both tables:
```python
# Press the green button in the gutter to run the script.
if __name__ == '__main__':
    # Import project settings
    project_settings = get_project_settings(import_filepath=import_filepath)
    # Create a paper_trade_table
    sql_interaction.create_trade_table("paper_trade_table", project_settings)
    # Create a live_trade_table
    sql_interaction.create_trade_table("live_trade_table", project_settings)
```
P.S. If you’re wondering why there is an extensive focus on table creation and SQL interaction, all will be made clear when we build a better backtester together.
Building on the `sql_execute` function, the final function in this section is `insert_trade_action`. It inserts a row of data for a trade action. Here’s the code:
```python
def insert_trade_action(table_name, trade_information, project_settings):
    """
    Function to insert a row of trade data
    :param table_name: String
    :param trade_information: Dictionary
    :param project_settings: JSON Object
    :return: Boolean
    """
    # Make sure that only valid tables are entered
    if table_name == "paper_trade_table" or table_name == "live_trade_table":
        # Shorten trade_information for readability
        ti = trade_information
        # Construct the SQL query
        sql_query = f"INSERT INTO {table_name} (strategy, exchange, trade_type, trade_stage, symbol, volume, stop_loss, " \
                    f"take_profit, price, comment, status, order_id) VALUES (" \
                    f"'{ti['strategy']}'," \
                    f"'{ti['exchange']}'," \
                    f"'{ti['trade_type']}'," \
                    f"'{ti['trade_stage']}'," \
                    f"'{ti['symbol']}'," \
                    f"{ti['volume']}," \
                    f"{ti['stop_loss']}," \
                    f"{ti['take_profit']}," \
                    f"{ti['price']}," \
                    f"'{ti['comment']}'," \
                    f"'{ti['status']}'," \
                    f"'{ti['order_id']}'" \
                    f")"
        # Execute the query
        return sql_execute(sql_query=sql_query, project_settings=project_settings)
    else:
        # Reject unknown table names (custom error handling coming soon)
        return False
```
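A note on the f-string approach above: because the values are interpolated directly into the SQL text, a comment containing a quote character would break the statement, and the pattern is vulnerable to SQL injection in general. psycopg2 supports parameterised queries, where values are passed separately and the driver handles quoting. Here’s a hedged sketch of that alternative (`build_insert_query` is my own name, not from the book’s repository):

```python
# Column order matching the trade tables created earlier
INSERT_COLUMNS = ["strategy", "exchange", "trade_type", "trade_stage", "symbol",
                  "volume", "stop_loss", "take_profit", "price", "comment",
                  "status", "order_id"]


def build_insert_query(table_name, trade_information):
    """Build a parameterised INSERT plus its values tuple for cursor.execute()."""
    # One %s placeholder per column; psycopg2 substitutes the values safely
    placeholders = ", ".join(["%s"] * len(INSERT_COLUMNS))
    query = f"INSERT INTO {table_name} ({', '.join(INSERT_COLUMNS)}) VALUES ({placeholders})"
    params = tuple(trade_information[column] for column in INSERT_COLUMNS)
    return query, params


# Usage with a psycopg2 cursor:
# query, params = build_insert_query("paper_trade_table", trade_row)
# cursor.execute(query, params)  # the driver quotes/escapes each value
```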
Now, update your main to insert a fake trade row. You’re welcome to use your own values — my fake trade row states that I’ve successfully placed an order for BTCUSD at $18,501.21, with a Stop Loss of $17,000.50 and a Take Profit of $25,000.30, using my TestStrategy.
```python
if __name__ == '__main__':
    # Import project settings
    project_settings = get_project_settings(import_filepath=import_filepath)
    # Define a fake trade row
    trade_row = {
        "strategy": "TestStrategy",
        "exchange": "MT5",
        "trade_type": "BUY_STOP",
        "trade_stage": "ORDER",
        "symbol": "BTCUSD",
        "volume": 1.0,
        "stop_loss": 17000.50,
        "take_profit": 25000.30,
        "price": 18501.21,
        "comment": "Test Trade, ignore",
        "status": "SUCCESS",
        "order_id": "test_order"
    }
    # Add a row to paper_trade_table
    sql_interaction.insert_trade_action("paper_trade_table", trade_row, project_settings)
```
In the next episode, I’ll show you how to connect this functionality to your trading bot.