An analyzer that provides embedded SQL syntax analysis when writing queries with Npgsql.FSharp. It verifies query syntax, checks that the parameters in the query match the provided parameters, and type-checks the functions that read columns from the result sets.
Queries can also be written as module-level [<Literal>] text and referenced from Sql.query (see below).
Package | Stable | Prerelease |
---|---|---|
NpgsqlFSharpAnalyzer | | |
Ubik | | |
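For example, in a query like the one below the analyzer checks the SQL text, verifies that the @is_active parameter is provided, and ensures that the username column is read with a function of the matching type (the table and column names here are illustrative):
open Npgsql.FSharp

// The analyzer validates the SQL, the @is_active parameter and the typed
// read of the "username" column against the development database.
let activeUsernames (connectionString: string) =
    connectionString
    |> Sql.connect
    |> Sql.query "SELECT username FROM users WHERE is_active = @is_active"
    |> Sql.parameters [ "is_active", Sql.bit true ]
    |> Sql.execute (fun read -> read.text "username")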
The analyzer requires a connection string that points to the database you are developing against. You can configure this connection string by either creating a file called NPGSQL_FSHARP
(without extension) somewhere next to your F# project, or preferably in the root of your workspace. This file should contain the connection string and nothing else. An example of the contents of such a file:
Host=localhost; Username=postgres; Password=postgres; Database=databaseName
Remember to add an entry to your .gitignore file to make sure you don't commit the connection string to your version control system.
Another way to configure the connection string is by setting the value of an environment variable named NPGSQL_FSHARP
that contains the connection string.
The analyzer will try to locate and read the file first, then fall back to using the environment variable.
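This connection string is only used by the analyzer at development time; your application still passes whatever connection string it needs at runtime. For local development you could reuse the same environment variable, as in this illustrative sketch (the query and function names are assumptions, not part of the analyzer):
open System
open Npgsql.FSharp

// Illustrative only: reuse the NPGSQL_FSHARP environment variable as the
// runtime connection string during local development.
let connectionString () =
    Environment.GetEnvironmentVariable "NPGSQL_FSHARP"

let usernames () =
    connectionString ()
    |> Sql.connect
    |> Sql.query "SELECT username FROM users"
    |> Sql.execute (fun read -> read.text "username")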
Download the latest version of NpgsqlFSharpVs.vsix
from the Releases page and double-click the extension to install it on your system.
Use paket to install the analyzer into a specialized Analyzers
dependency group like this:
paket add NpgsqlFSharpAnalyzer --group Analyzers
DO NOT use storage:none
because we want the analyzer package to be downloaded physically into the packages/analyzers directory.
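With this setup your paket.dependencies file ends up with a dedicated group along these lines (a minimal sketch; the source feed URL is an assumption of a standard NuGet feed):
group Analyzers
    source https://api.nuget.org/v3/index.json
    nuget NpgsqlFSharpAnalyzer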
If you use nuget instead of paket, install the package into the same directory:
nuget install NpgsqlFSharpAnalyzer -OutputDirectory packages/analyzers
Make sure you have these settings in Ionide for FSharp:
{
"FSharp.enableAnalyzers": true,
"FSharp.analyzersPath": [
"./packages/analyzers"
]
}
This instructs Ionide to load analyzers from the directory into which NpgsqlFSharpAnalyzer was installed.
Install Ubik as a global dotnet tool:
dotnet tool install ubik --global
Then run it from inside the directory of the project you want to analyze:
cd ./path/to/project
ubik
You can also point it at a specific project file or at individual F# files:
ubik ./path/to/Project.fsproj
ubik ./File1.fs ./AnotherFile.fs
To see which version is installed:
ubik --version
When it is not convenient to write a query inline like this:
Sql.query "HERE COMES A VERY LONG QUERY"
you can define the query as a module-level [<Literal>]
string and use it from the Sql.query
function like this:
let [<Literal>] selectActiveUsers = """
SELECT * FROM users
WHERE is_active = @is_active
"""
let activeUsers (connectionString: string) =
connectionString
|> Sql.connect
|> Sql.query selectActiveUsers
|> Sql.parameters [ "is_active", Sql.bit true ]
|> Sql.execute (fun read -> read.text "username")
The [<Literal>]
string has to be defined in the same module in order for the analysis to run properly.
Better yet, if you use the FSharp.Data.LiteralProviders package, you can write your SQL queries in external SQL files and load them at compile time as [<Literal>] strings so that the analyzer can pick them up:
let [<Literal>] selectActiveUsers = TextFile<"selectActiveUsers.sql">.Text
let activeUsers (connectionString: string) =
connectionString
|> Sql.connect
|> Sql.query selectActiveUsers
|> Sql.parameters [ "is_active", Sql.bit true ]
|> Sql.execute (fun read -> read.text "username")
Just remember that these [<Literal>]
strings have to be defined in the same module where the query is written.
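For instance, a query literal defined in a different module may not be resolved by the analyzer; a sketch of the situation to avoid (module names are illustrative):
module Queries =
    let [<Literal>] selectActiveUsers = "SELECT * FROM users WHERE is_active = @is_active"

module Data =
    open Npgsql.FSharp

    // The literal lives in a different module than the Sql.query call,
    // so the analyzer may not pick up the query text here.
    let activeUsers (connectionString: string) =
        connectionString
        |> Sql.connect
        |> Sql.query Queries.selectActiveUsers
        |> Sql.parameters [ "is_active", Sql.bit true ]
        |> Sql.execute (fun read -> read.text "username")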
Use the Sql.skipAnalysis
function from the main library to tell the analyzer to skip the analysis of a code block like this one:
open Npgsql.FSharp
let badQuery connection =
connection
|> Sql.query "SELECT * FROM non_existing_table"
|> Sql.skipAnalysis
|> Sql.execute (fun read -> read.int64 "user_id")
Make sure the required build tools, such as the .NET SDK, are installed on your system, then run the build script:
> build.cmd <optional buildtarget> // on windows
$ ./build.sh <optional buildtarget> // on unix
The tests create and dispose of a test database dynamically, so you don't need to set anything up. This database is created using a default connection string that connects to your local Postgres server like this:
let createTestDatabase() =
Sql.host "localhost"
|> Sql.port 5432
|> Sql.username "postgres"
|> Sql.password "postgres"
|> Sql.formatConnectionString
|> ThrowawayDatabase.Create
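A test would typically bind the throwaway database with use so that it is dropped automatically when the test finishes. A minimal sketch, assuming ThrowawayDb's ConnectionString member and an illustrative query:
open Npgsql.FSharp
open ThrowawayDb

let ``queries run against a throwaway database`` () =
    // Created on the fly, dropped when disposed at the end of the test.
    use db = createTestDatabase ()
    db.ConnectionString
    |> Sql.connect
    |> Sql.query "SELECT 1 AS one"
    |> Sql.execute (fun read -> read.int "one")
    |> ignore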
Make sure you have a Postgres user whose username and password are both postgres. An easy way to do this is to run the dockerized instance of Postgres set up in the docker-compose.yml
file in the root of this repository:
repo root> docker-compose up -d
This will spawn a PostgreSQL 12 database with the expected username and password, bound to the default port 5432. When you are done testing, this database can be destroyed like so:
repo root> docker-compose down
The build script supports the following targets:
Clean - Cleans artifact and temp directories.
DotnetRestore - Runs dotnet restore on the solution file.
DotnetBuild - Runs dotnet build on the solution file.
DotnetTest - Runs dotnet test on the solution file.
GenerateCoverageReport - Code coverage is run during DotnetTest and this generates a report via ReportGenerator.
WatchTests - Runs dotnet watch with the test projects. Useful for rapid feedback loops.
GenerateAssemblyInfo - Generates AssemblyInfo for libraries.
DotnetPack - Runs dotnet pack. This includes running Source Link.
PublishToNuGet - Publishes the NuGet packages generated in DotnetPack to NuGet via paket push.
BuildDocs - Generates documentation from docsSrc and the XML documentation comments from your libraries in src.
WatchDocs - Generates documentation and starts a webserver locally. It will rebuild and hot reload if it detects any changes made to docsSrc files, libraries in src, or the docsTool itself.
ReleaseDocs - Will stage, commit, and push docs generated in the BuildDocs target.