Database static and streaming provider


What is it

Users have asked for a way to connect to databases, and Wealth-Lab listened: this provider connects to any ADO.NET-compatible data source. Out of the box, a typical PC with the .NET Framework supports Oracle, OleDb, ODBC, SQL Server and SQL Server Compact.

Creating a new DataSet

To work with a database containing some data, you need to create a new static DataSet first. Pick the appropriate Provider from the list of available ones (e.g. System.Data.OleDb/ODBC for Excel, Access, text files and lots more; System.Data.SqlClient for SQL Server, etc.), compose a connection string, and you're ready for the next step.

Usually this is a no-brainer for database folks, but should you feel lost in connection strings, click the bluish Connection string Help label: using the bits and pieces of your connection string and/or the selected provider, it will try to guess what kind of database you're connecting to and take you to the relevant page of the connectionstrings.com site.
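For orientation, here are a few illustrative connection strings; the server, file and database names are placeholders to replace with your own (note that, per the Usage Notes and Limitations below, ODBC driver names are entered without curly braces):

System.Data.SqlClient (SQL Server):
  Server=.\SQLEXPRESS;Database=Quotes;Integrated Security=SSPI;

System.Data.OleDb (Access):
  Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\Data\Quotes.accdb;

System.Data.Odbc (MySQL):
  Driver=MySQL ODBC 5.1 Driver;Server=localhost;Database=quotes;Uid=user;Pwd=secret;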


A rule of thumb is to type in the connection string first and only then proceed to selecting a provider. When you're finished with this step, the provider will populate the Table name dropdown box with the list of tables found. Select the one you need, and the provider will fill the other boxes with the available fields. Once you define a field to serve as the Security Name, the "Symbols" area is populated with the tickers found. Note: when the database contains thousands of symbols, this step may take a while (several minutes), so please be patient.

Match the available fields to DateTime, Open, High, Low and Close; Volume and Named Series are optional. Note: symbol names should use the varchar() type, not text. The provider won't let you confuse a DateTime field with a price field and vice versa. To finish creating your DataSet, select a bar scale and interval.

(Screenshot: New DataSet Wizard)

Setting up Streaming quotes

It's possible to use one source for static data while getting your streaming quotes from a different database (provided that the symbol exists in both). Setting up the provider for Streaming quotes is a two-step process:

  1. Only one data source for streaming quotes can be defined at any time; it's a global setting. To specify the database to use for real-time quotes, open the Data Manager tool, switch to the Database provider's tab, and repeat there all the steps from the "New DataSet Wizard". For streaming quotes, only the Close price is required; Volume is optional.

    (Screenshot: Configure streaming source)


  2. Finally, set "Database provider" as your current streaming provider in Wealth-Lab's Preferences dialog, on the Streaming tab.

Notes:

  1. Push "Clear" button before first-time configuration or when changing settings.
  2. You must restart WL6 after changing Streaming settings.





Usage Notes and Limitations


When composing an ODBC connection string, specify the driver name without curly braces:

Driver={MySQL ODBC 5.1 Driver};                      incorrect
Driver=MySQL ODBC 5.1 Driver;                        correct
Driver={Microsoft Access Driver (*.mdb, *.accdb)};   incorrect
Driver=Microsoft Access Driver (*.mdb, *.accdb);     correct

Bonus: Bulk import of CSV data files into an SQL Server database

This companion article illustrates how to use a Wealth-Lab Strategy to import the data from a large group of ASCII files into an SQL Server database.

Problem

You have a big bunch of ASCII CSV files from your data vendor containing DTOHLCV series for a large group of stocks. You would like to import all the files into an SQL Server database to run Strategies using the Database provider.

Here are a few lines from a data file:

Date,Open,High,Low,Close,Volume,Adj Close
2009-12-31,6.00,6.00,6.00,6.00,400,6.00
2009-12-30,5.91,5.91,5.89,5.89,13500,5.89
2009-12-29,5.83,5.96,5.83,5.90,12700,5.90
...

As we can see, the symbol is missing: it's part of the file name, e.g. TEST.CSV. Obviously, it makes no sense to dump the data for a group of symbols into a single table as-is (without a Symbol column, the provider couldn't tell one symbol from another), nor is it practical to put each instrument in an individual table (too time-consuming and inconvenient).

Creating an SQL Server database

First, set up a new database - for example, using Microsoft SQL Server Management Studio. Next, create a new Table with the following columns:

Field      Type          Nullability
Symbol     varchar(10)   non-null
Date       datetime      non-null
Open       float         non-null
High       float         non-null
Low        float         non-null
Close      float         non-null
Volume     bigint        nullable
AdjClose   float         nullable

Note that symbol names have to be of type varchar. Now it's time to index the table. The Symbol field must be indexed in any case, or querying the database would be too slow. If you don't plan to update the table very often, you can add more indexes; otherwise, don't over-index your table.
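If you prefer to script the setup instead of clicking through Management Studio, something along these lines would do. It's a minimal sketch: the connection string and index name are illustrative assumptions, and the database/table names simply mirror the "Database"/"Table" placeholders used by the Strategy further down.

using System;
using System.Data.SqlClient;

class CreateQuotesTable
{
    static void Main()
    {
        // Placeholder connection string - point it at your own server/instance
        string connString = @"Server=.\SQLEXPRESS;Database=Database;Integrated Security=SSPI;";

        // T-SQL matching the column list above; names that collide with T-SQL keywords/types are bracketed.
        // The non-clustered index on (Symbol, Date) is what keeps per-symbol queries fast.
        string ddl = @"
            CREATE TABLE dbo.[Table] (
                Symbol   varchar(10) NOT NULL,
                [Date]   datetime    NOT NULL,
                [Open]   float       NOT NULL,
                High     float       NOT NULL,
                Low      float       NOT NULL,
                [Close]  float       NOT NULL,
                Volume   bigint      NULL,
                AdjClose float       NULL
            );
            CREATE NONCLUSTERED INDEX IX_Table_Symbol ON dbo.[Table] (Symbol, [Date]);";

        using (SqlConnection con = new SqlConnection(connString))
        using (SqlCommand cmd = new SqlCommand(ddl, con))
        {
            con.Open();
            cmd.ExecuteNonQuery();   // creates the table and the index in one batch
        }
    }
}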

Coding the bulk import routine

Although a command-line utility could be created, why bother with a C# IDE or the .NET SDK when we have Wealth-Lab's built-in Strategy Editor?

One obvious problem is that the data files are missing the Symbol column, which is the table's index. We're going to take the current symbol from the file name (i.e. Ticker.CSV) and insert the missing piece on the fly.

Below you'll find the complete code of a WealthScript program that imports all .CSV files from a directory into a single table. Key parameters such as the connection string, table name and source folder are configurable. Follow the comments to understand the logic.

Note: before using this code, open the "References..." dialog in Wealth-Lab's Editor and tick two checkboxes: next to System.Data and System.Xml.

using System;
using System.Collections.Generic;
using System.Text;
using System.Drawing;
using WealthLab;
using WealthLab.Indicators;
using System.Data;
using System.Data.SqlClient;
using System.IO;
using System.Globalization;

namespace WealthLab.Strategies
{
    public class SQLBulkImportCSV : WealthScript
    {
        /// <summary>Database name</summary>
        public static string dbName = "Database";
        /// <summary>Path to the directory with the source CSV files</summary>
        public static string baseDir = @"C:\Data\";
        /// <summary>Table name for the bulk import (also the name of the subfolder under baseDir)</summary>
        public static string dirName = "Table";
        /// <summary>SQL Server connection string (adjust the instance name and authentication to your setup)</summary>
        public static string connString = @"server=.\MySQLServer;database=" + dbName + ";Integrated Security=SSPI";

        /// <summary>SQL bulk import routine</summary>
        /// <param name="ticker">Symbol name</param>
        /// <param name="filepath">Full path to the symbol's data file</param>
        static void ImportCSV(string ticker, string filepath)
        {
            using (SqlConnection con = new SqlConnection(connString))
            using (StreamReader sr = new StreamReader(filepath))
            {
                // Read the header line; here are the columns found in the data: Date, Open, High...
                string line = sr.ReadLine();
                string[] value = line.Split(',');
                string[] result = new string[0];
                // An extra column for the Symbol field
                int newColumns = value.GetLength(0) + 1;

                // A DataTable for putting the CSV data in
                System.Data.DataTable dt = new System.Data.DataTable();
                DataRow row;
                // Insert the "Symbol" column that is missing from the CSV file being parsed
                dt.Columns.Add(new DataColumn("Symbol"));
                // Add the other columns (DOHLCV + custom series)
                foreach (string dc in value)
                    dt.Columns.Add(new DataColumn(dc));

                // Read the CSV file line by line (the header has already been consumed above)
                while (!sr.EndOfStream)
                {
                    string source = sr.ReadLine();
                    // Skip empty lines and garbage
                    if (source.Length <= 1)
                        continue;
                    // Treat the comma-separated line as an array
                    value = source.Split(',');
                    // Resize the array to add space for the Symbol column
                    Array.Resize(ref result, newColumns);
                    // Insert the parsed row after the Symbol slot
                    Array.Copy(value, 0, result, 1, value.Length);
                    // Put the ticker at the very beginning
                    result[0] = ticker;

                    if (result.Length == dt.Columns.Count)
                    {
                        // Assign the resulting string array to a new DataRow
                        row = dt.NewRow();
                        row.ItemArray = result;
                        dt.Rows.Add(row);
                    }
                }

                // Use the SqlBulkCopy class to write the DataTable built from a single stock's CSV file to SQL Server
                SqlBulkCopy bc = new SqlBulkCopy(con.ConnectionString, SqlBulkCopyOptions.TableLock);
                bc.DestinationTableName = dirName;
                bc.BatchSize = dt.Rows.Count;
                con.Open();
                bc.WriteToServer(dt);
                bc.Close();
                con.Close();
            }
        }

        protected override void Execute()
        {
            // Get the collection of files from the source directory
            string destination = Path.Combine(baseDir, dirName);
            DirectoryInfo dirCustom = new DirectoryInfo(destination);
            FileInfo[] filCustom = dirCustom.GetFiles("*.csv");

            // Loop over each .CSV file, importing its data to SQL Server
            foreach (FileInfo f in filCustom)
                ImportCSV(Path.GetFileNameWithoutExtension(f.Name), f.FullName);
        }
    }
}
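When the import completes, it's worth a quick sanity check that rows actually landed in the table. Here's a minimal standalone sketch, reusing the same placeholder connection string and table name as above; "TEST" is just the example ticker from the sample file.

using System;
using System.Data.SqlClient;

class VerifyImport
{
    static void Main()
    {
        // Same placeholder connection string as in the import Strategy
        string connString = @"server=.\MySQLServer;database=Database;Integrated Security=SSPI";

        using (SqlConnection con = new SqlConnection(connString))
        using (SqlCommand cmd = new SqlCommand(
            "SELECT COUNT(*) FROM [Table] WHERE Symbol = @sym", con))
        {
            cmd.Parameters.AddWithValue("@sym", "TEST");   // example ticker from the sample file
            con.Open();
            int rows = (int)cmd.ExecuteScalar();
            Console.WriteLine("Imported rows for TEST: " + rows);
        }
    }
}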