Structuring an F# project with SQL Type Provider on CI

Flavio Lucio
[Image: Grant mechanical calculation machine, 1877]

Being able to move fast while keeping the codebase safe is one of the dream capabilities of a development team. To achieve this goal, we at Datarisk leverage type providers, a distinctive feature of the F# programming language. The benefits are substantial, but things get complicated when dealing with Continuous Integration. The aim of this article is to present our solution to the problem of generating types, at compile time, from a database server running in a CI environment.

For full reference, you can find the demo project in our GitHub repository.


F#’s Type Providers

F#’s type providers allow structured data to be accessed through strong types. Unlike traditional ORMs, which require the programmer to manually specify the type with which to read the data, the F# compiler infers a type from samples of the data. F# ships with several type providers, including CSV, JSON, XML, and SQL.

Strongly Typed SQL

By using an SQL type provider, instead of writing your SQL queries as strings, you can write them with F#’s computation expressions. This brings several advantages, such as auto-completion and compile-time checks. After getting used to SQL type providers, one often wonders why anyone ever opted to write SQL queries as raw strings. After all, relational databases do have strong type definitions for table columns. Why would you write queries in a language that doesn’t benefit from those guarantees?
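As a sketch of what this looks like with the SQLProvider library (the connection string, schema, and table and column names below are illustrative, not taken from the demo project):

```fsharp
open FSharp.Data.Sql

// Illustrative compile-time connection string: the compiler reads the
// schema from this database while type-checking the project.
[<Literal>]
let CompileTimeConnString =
    "Host=localhost;Port=5433;Database=postgres;Username=postgres;Password=admin"

type Db =
    SqlDataProvider<
        DatabaseVendor = Common.DatabaseProviderTypes.POSTGRESQL,
        ConnectionString = CompileTimeConnString>

// A query written as a computation expression: table and column names are
// checked at compile time and auto-completed by the IDE.
let activeUserEmails (ctx : Db.dataContext) =
    query {
        for user in ctx.Public.Users do
        where user.Active
        select user.Email
    }
    |> Seq.toList
```

If you rename a column in a migration, every query that still uses the old name fails to compile instead of failing at runtime.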

Type providers and CI

Note, however, that the compiler must have access to samples of the data. This is easy with CSV and JSON, since you can just point the provider to a file committed to the repository. For an SQL type provider, you need to grant the compiler access to a running database.

I will write this again: your compiler needs read access to a running database.

Furthermore, the database must have the schema that you expect to read on CI. This is easily done in the developer environment, but what about CI environments? Moreover, how do you build a Docker image from source code that requires a running database in order to compile?

These questions prevented our team from bringing SQL type providers to production for a while. We have since worked out a solution, which is described in the remainder of this document.

Two databases on our production solution

Our solution to this problem uses two databases.

  • Runtime database: the database that you produce by running the migrations. In production, it contains your real data. The credentials for this database are strong (usually read from secret vaults at runtime).
  • Compile time database: a database that holds a copy of the schema. It must be accessible to the compiler. The credentials for this database are written in the source code.

The two-database solution emerged from the necessity of having hard-coded connection strings in the source code. Recall: every time you invoke the compiler, it needs to access the database. Your auto-complete needs it, and so does your test suite. We could not figure out a way to have the compile time database use strong credentials, so we opted to leave its connection string directly in the source code.
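In F# terms, the split looks roughly like this (the connection string and environment variable name are illustrative):

```fsharp
// Compile time database: weak, hard-coded credentials that the compiler
// (and your editor's auto-complete) can always reach. Type provider
// arguments must be literals, hence the attribute.
[<Literal>]
let CompileTimeConnString =
    "Host=localhost;Port=5433;Database=postgres;Username=postgres;Password=admin"

// Runtime database: strong credentials resolved only at runtime, here from
// a hypothetical environment variable populated by a secret vault.
let runtimeConnString () =
    System.Environment.GetEnvironmentVariable "DB_CONNECTION_STRING"
```

SQLProvider's GetDataContext accepts a connection string argument at runtime, so the hard-coded string is only ever used by the compiler, never to reach production data.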

The solution above is what we use in production. In this sample project, however, we have simplified things a bit and use a single database. When porting this to your production app, make sure to differentiate between the compile time and runtime databases.

The Stack

Dotnet core

The instructions in this article were tested on dotnet SDK 5.0.400. I expect them to work on later versions of dotnet 5, but beware that some changes might be necessary.


F#

Our team's technology vision explicitly states that we seek to leverage Hindley-Milner type systems for business value to the best of our capabilities. We want compilers to help programmers write better software at lower implementation cost. We are big enthusiasts of strongly typed functional programming, and we appreciate the fact that F# bridges modern research in programming languages and the time-tested .NET platform.


DbUp

We use DbUp to run our migrations. DbUp is a dotnet library for running migrations written as SQL files. The project src/Migrations uses it to expose:

  • A command line tool that runs migrations
  • A library that can be called from our main application to run migrations at runtime.
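A minimal sketch of what such a helper might look like with DbUp's fluent API (the Postgres support comes from the dbup-postgresql package; the function name is illustrative):

```fsharp
open System.Reflection
open DbUp

// Run every SQL script embedded in this assembly against the given
// database, in order, recording which scripts were already applied.
let runMigrations (connectionString : string) =
    let upgrader =
        DeployChanges.To
            .PostgresqlDatabase(connectionString)
            .WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly())
            .LogToConsole()
            .Build()
    upgrader.PerformUpgrade()
```

Because DbUp tracks applied scripts in a journal table, the same helper is safe to call both from the command line tool and from the application at startup.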


PostgreSQL

We think the relational model is the most solid way to store our data, and PostgreSQL is the best open-source RDBMS available. Indeed, it was built by great programmers implementing great ideas. The POSTGRES project was started at the University of California, Berkeley by the distinguished professor Michael Stonebraker after reading Edgar F. Codd's seminal papers on the relational model. PostgreSQL has been around since 1995 and currently has a large user base, a good indicator of maturity for the most advanced open-source database available.

Build Instructions

Building the project locally

First start the database with:

docker-compose up -d postgres

Then install the dependencies and apply the migrations:

dotnet tool restore
dotnet paket restore
dotnet run -p src/Migrations

You can now build your project with:

dotnet build

Building the project on CI

This article contains CI instructions for GitHub. To build a project with GitHub Actions, one defines a YAML workflow file inside the .github/workflows folder. In this file we need to perform two main steps:

  • Declare a Postgres service that will accept the same connection string that you use on your development environment.

services:
  postgres:
    image: postgres:12.6
    env:
      POSTGRES_DB: postgres
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: "admin"
      POSTGRES_HOST_AUTH_METHOD: trust
    options: >-
      --health-cmd pg_isready
      --health-interval 10s
      --health-timeout 5s
      --health-retries 5
    ports:
      - 5432:5432

  • Run the migrations before building your project

- uses: actions/checkout@v2
- name: Setup .NET
  uses: actions/setup-dotnet@v1
  with:
    dotnet-version: 5.0.400
- name: Restore dependencies
  run: dotnet tool restore && dotnet paket restore && dotnet restore
- name: Apply migrations
  run: sh/
- name: Build
  run: dotnet build --no-restore

For complete reference, the full YAML file can be found in the article’s repository.

Building a Docker image

One often builds a docker image with the command:

docker build .

The command above fetches a base image and executes the commands outlined in the Dockerfile one by one, each building a new Docker layer. The image we want to deploy is the final layer produced by this process. However, when executing those commands, the dotnet compiler would need access to a running database.

We could not figure out whether it makes sense to try to solve this problem within Docker. Instead, we solved it by having the host machine build the dotnet DLLs and then copying these assets into the Docker image.

We created a script that will:

  • Clear the environment of the host machine (both changes to the repository and running containers)
  • Start a clean database
  • Apply the migrations
  • Publish the dotnet executable assets
  • Build the Docker image; the Dockerfile has a command that copies the executable assets from the host machine into the image
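The resulting Dockerfile is then trivial, since no compilation happens inside it. A minimal sketch, assuming the host published its assets to a publish folder (paths are illustrative):

```dockerfile
# Runtime-only base image: the compiler never runs inside Docker,
# so no database access is needed during the image build
FROM mcr.microsoft.com/dotnet/runtime:5.0
WORKDIR /app
# Copy the assets published on the host machine into the image
COPY ./publish /app/build
ENTRYPOINT ["/app/build/SampleSQLProviderApp"]
```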

From the root of the repository execute:


You can run the app from the docker container with:

docker run -it --rm sqlprovider-demo:latest /app/build/SampleSQLProviderApp

Porting schema from runtime database into compile time database

We use the following commands to transfer the schema from the runtime database (assumed, in these commands, to run on port 5432) into the compile time database (port 5433).

  • Create an SQL dump of your schema

PGPASSWORD=admin pg_dump -U dbuser -h localhost -p 5432 -d demo -s -c -x -O --no-comments > db-schema.sql

  • Load the SQL dump on the compile time database

PGPASSWORD=admin psql -d postgres -U postgres -h localhost -p 5433 < db-schema.sql


Using SQL type providers to access databases is certainly not common in the development community. But taking risks on new approaches, and eventually reaping the benefits, is one of our team's core values. This article has shown how we dealt with the problem of generating types, at compile time, from a database server running in a CI environment. For more articles from our technology unit, feel free to check our blog, and visit our website to see the cool things we build.

I would like to thank Juarez Sampaio for co-authoring this text with me.
