Samples
In addition to the many ETL examples throughout the documentation (such as the extensive Slowly Changing Dimension Example), this article describes how to download, build, and run the ETL project samples below on Windows and Linux.
The samples are listed in order from simple to more involved.
Instructions
Download Samples
Download and unpack the samples:
Setup
Ensure your development environment has the necessary bits; see System Requirements. Also note:
- The samples multi-target .NET Framework 4.6.2 (net462, on Windows only) and .NET 6.0 (net6.0). Either install any missing SDKs from https://dotnet.microsoft.com/download/visual-studio-sdks or via the Visual Studio installer, or edit each project file and remove unwanted frameworks from the two <TargetFrameworks> lines (see the example below).
- If using Visual Studio, install the ".NET desktop development" and ".NET cross-platform development" workloads.
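As an illustration, trimming a project to target only .NET 6.0 would change the target framework line roughly as shown below; the exact lines and any OS conditions in the sample .csproj files may differ, so compare with the actual files before editing:

    <!-- Illustration only; compare with the actual sample .csproj files. -->
    <!-- Before (multi-targeting): -->
    <TargetFrameworks>net462;net6.0</TargetFrameworks>
    <!-- After (only .NET 6.0): -->
    <TargetFrameworks>net6.0</TargetFrameworks>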
Populate the "Samples/actionetl.license.json" file with a FREE Community license, a free 30-day trial license, or a commercial license.
Run
In each project, restore dependencies, then build, run, and clean the sample application, either via Visual Studio or from the command line, picking a suitable target framework:
    dotnet restore
    dotnet build
    dotnet run --no-build --framework net6.0
    dotnet clean
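On Windows, a sample can also be run against the .NET Framework target by picking that framework instead:

    dotnet run --no-build --framework net462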
Important
The sample data input files are copied to, and the data output files are created in, the runtime folder, e.g. bin/Debug/net6.0/ when running a Debug build on .NET 6.0. You can change the location of the data files as needed.
FileExists Sample

A simple application that checks if a file exists. This is the same application you get when creating a new project with the dotnet actionetl.console template. It:
- Creates and runs a worker system
- Uses a FileExistsWorker to check for a file
- Reads the filename from a configuration file
- Flags success or failure via the application exit code
A detailed explanation of the sample is available in First actionETL Application.
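To show the overall shape of such an application, here is a rough sketch; the worker system and FileExistsWorker appear in the sample, but the constructor parameters, configuration key, and outcome check below are placeholders, so refer to the sample source and the First actionETL Application article for the real code:

    // Rough sketch only: member and configuration names are approximations,
    // not the sample's exact code.
    using actionETL;

    internal static class Program
    {
        private static int Main()
        {
            var workerSystem = new WorkerSystem()
                .Root(root =>
                {
                    // Check for the file named in the configuration file.
                    _ = new FileExistsWorker(root, "Check file exists",
                        root.Config["FileToCheck"]);
                });

            var outcome = workerSystem.Start();

            // Flag success or failure via the application exit code.
            return outcome.Succeeded ? 0 : 1;
        }
    }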
CopyFileIfExists Sample 

A simple application that extends the above MyCompany.ETL.FileExists sample to copy
a file twice if it exists. It adds:
- Grouping workers
- Setting start constraints between workers
- Copying files
DataflowThroughput Sample

A fairly simple application that demonstrates:
- Generating multiple sets of similar workers via looping
- Generating dataflow test data by creating an enumerator using the yield keyword (see the sketch below)
- Dataflow row throughput with multiple parallel streams of data in a single worker system
- The rows are all generated in multiple sources and redirected to different targets, where they are then discarded
- All work happens in-memory
- With a single source worker and reasonably new hardware, typical throughput is 40 to 60 million rows per second
- With more source workers, aggregate throughput increases, but reaches a maximum when the computer eventually becomes CPU limited
Note
For best performance, use the 'Release' (not 'Debug') configuration, and run the sample either with 'Debug > Start Without Debugging' or directly from the command line.
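The yield-based test data generation can be illustrated in plain C#; the row type and field names here are made up for illustration, and the sample defines its own row class:

    // Lazily produces test rows with an iterator method; each row is created
    // only when the dataflow source pulls it, so no large collection is built up.
    // 'TestRow' and its fields are hypothetical.
    using System.Collections.Generic;

    public class TestRow
    {
        public int Id;
        public string Name;
    }

    public static class TestData
    {
        public static IEnumerable<TestRow> CreateRows(int count)
        {
            for (int i = 0; i < count; i++)
            {
                yield return new TestRow { Id = i, Name = "Row " + i };
            }
        }
    }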
MultipleWorkerSystemsDataflow Sample 
This sample runs two worker systems in one application (or just the first one if you're using a community license). Here's the first one:

- Use a helper method to generate two similar sets of workers
- Use RepeatRowsSource workers to generate copies of the provided row(s)
- Increase the dataflow port buffer size, which may increase performance, e.g. when not bottlenecked by external data sources
- Throw an exception if the worker system fails
Second worker system:

- Use a UnionAllTransform worker to combine the rows from multiple upstream workers
- Access workers by name
- Access ports by index
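Structurally, running the sample's two worker systems one after the other, and throwing if either fails, can be sketched as follows; the worker creation is omitted, and the Start and outcome members are approximations of the actionETL API, so see the sample source for the exact code:

    // Skeleton only; the workers inside each system are omitted, and member
    // names are approximations rather than the sample's exact code.
    using System;
    using actionETL;

    internal static class Program
    {
        private static void Main()
        {
            // First worker system, e.g. RepeatRowsSource workers feeding targets.
            var firstOutcome = new WorkerSystem()
                .Root(root =>
                {
                    // ... create the first system's workers here ...
                })
                .Start();

            if (!firstOutcome.Succeeded)
                throw new Exception("First worker system failed.");

            // Second worker system, e.g. a UnionAllTransform combining upstream rows.
            // With a Community license the sample only runs the first system.
            var secondOutcome = new WorkerSystem()
                .Root(root =>
                {
                    // ... create the second system's workers here ...
                })
                .Start();

            if (!secondOutcome.Succeeded)
                throw new Exception("Second worker system failed.");
        }
    }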
ReadSortWriteXlsx Sample 

A fairly simple application that:
- Reads an XLSX spreadsheet file into the dataflow
- Maps between XLSX and internal column names
- Calculates the value of a new column by using a row property
- Sorts the rows descending on one column and ascending on a second column (see the sketch below)
- Writes the resulting rows to an XLSX spreadsheet file
- Formats monetary values with two decimals
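Expressed in plain LINQ, the sort order and formatting correspond to the following; the column names are hypothetical, and the sample itself performs the sort and the two-decimal formatting inside the actionETL dataflow rather than with LINQ:

    // Plain-LINQ illustration of the sample's sort order and number formatting;
    // 'Amount' and 'Name' are hypothetical column names.
    using System.Collections.Generic;
    using System.Linq;

    public record Row(string Name, decimal Amount);

    public static class SortExample
    {
        public static IEnumerable<Row> SortRows(IEnumerable<Row> rows) =>
            rows.OrderByDescending(r => r.Amount)   // descending on one column
                .ThenBy(r => r.Name);               // ascending on a second column

        // Monetary values formatted with two decimals, e.g. "1234.50".
        public static string FormatAmount(decimal amount) => amount.ToString("F2");
    }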
AggregateCsvCreateInsertTable Sample 

An application that:
- Uses a helper method to generate two sets of workers that drop (if they exist) and create master and detail SQL (e.g. staging) tables
- The AdbTableNonQueryWorker used here also has commands for deleting all rows, truncating the table, checking if the table exists, etc.
- Reads data from a CSV file into the dataflow
- Duplicates the dataflow rows and:
  - Inserts the original rows as-is into the detail table
  - Aggregates and inserts the cloned rows into the master table (see the sketch below)
Note
This sample requires setting the appropriate database provider in the "Program.cs" file and the connection string in the "actionetl.aconfig.json" file.
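The master/detail split can be pictured with plain LINQ; the row shapes and column names below are hypothetical, and the sample performs the aggregation with actionETL dataflow workers before inserting into the master table:

    // Plain-LINQ illustration of aggregating detail rows into master rows;
    // the row shapes and column names are hypothetical.
    using System.Collections.Generic;
    using System.Linq;

    public record DetailRow(string CustomerId, decimal Amount);
    public record MasterRow(string CustomerId, decimal TotalAmount, int DetailCount);

    public static class AggregateExample
    {
        // Detail rows are inserted as-is; the duplicated rows are aggregated
        // per key into one master row each.
        public static IEnumerable<MasterRow> Aggregate(IEnumerable<DetailRow> details) =>
            details
                .GroupBy(d => d.CustomerId)
                .Select(g => new MasterRow(g.Key, g.Sum(d => d.Amount), g.Count()));
    }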
ProcessIncomingFiles Sample

A more involved application that:
- Creates a reusable custom worker to easily and reliably process many different feeds of data files
- Uses a helper method to create multiple similar sets of workers
- Loads data from XLSX and CSV files
- Executes database queries and calls stored procedures
- Inserts data into database tables
Note
This sample requires creating database scripts etc. before running; please see the:
- Project "Readme.txt" file
- actionETL documentation at Process Incoming Files Example