title | description | keywords |
---|---|---|
Affinity | dlt source for affinity.co | |
dlt source for Affinity.

If you don't know dlt but stumbled across this while searching for a way to get your data out of Affinity: this will do it. It lets you pull almost any data (except some enriched data) out of your Affinity instance and into any target system supported by dlt (Snowflake, Postgres, etc.).
Create a `.dlt/secrets.toml` with your API key:

```toml
affinity_api_key="<YOUR_API_KEY>"
```

and then run the default source with optional list references:
```python
import dlt

from dlt_source_affinity import ListReference, source as affinity_source

pipeline = dlt.pipeline(
    pipeline_name="affinity_pipeline",
    destination="duckdb",
    dev_mode=True,
)
affinity_data = affinity_source(
    # By default the data source loads:
    # - organizations
    # - persons
    # - lists
    # - opportunities
    # - notes
    # Optionally, an arbitrary number of lists and list views can be passed:
    list_refs=[
        # Loads the list with ID 123,
        # e.g. https://<your-subdomain>.affinity.co/lists/123/
        ListReference(123),
        # Loads the view with ID 456 in list 123,
        # e.g. https://<your-subdomain>.affinity.co/lists/123/views/456-all-organizations
        ListReference(123, 456),
    ]
)
pipeline.run(affinity_data)
```
Resources that can be loaded using this verified source are:
Name | Description | API version | Permissions needed |
---|---|---|---|
companies | The stored companies | V2 | Requires the "Export All Organizations directory" permission. |
persons | The stored persons | V2 | Requires the "Export All People directory" permission. |
opportunities | The stored opportunities | V2 | Requires the "Export data from Lists" permission. |
lists | A given list and/or a saved view of a list | V2 | Requires the "Export data from Lists" permission. |
notes | Notes attached to companies, persons, opportunities | Legacy | n/a |
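If you only need a subset of these resources, you can select them on the source before running the pipeline. A minimal sketch using dlt's standard `with_resources()` selector (resource names taken from the table above; default source arguments assumed):

```python
import dlt

from dlt_source_affinity import source as affinity_source

pipeline = dlt.pipeline(
    pipeline_name="affinity_pipeline",
    destination="duckdb",
)

# with_resources() is part of dlt's source API and restricts the load
# to the named resources; here only companies and notes are loaded.
affinity_data = affinity_source().with_resources("companies", "notes")
pipeline.run(affinity_data)
```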
There are two versions of the Affinity API:

- Legacy, which is available on all plans.
- V2, which is only available to customers on an enterprise plan.
This verified source makes use of both API versions. The authentication credentials are the same for both APIs; however, they differ in their authentication behavior.
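Concretely, the difference is in how the key is presented. A minimal sketch with `requests` (endpoints illustrative, not part of this source's code): the Legacy API expects HTTP Basic auth with the key as the password, while V2 expects a Bearer token.

```python
import requests

API_KEY = "<YOUR_API_KEY>"

# Legacy API: HTTP Basic auth, blank username, API key as the password
legacy = requests.get("https://api.affinity.co/lists", auth=("", API_KEY))

# V2 API: the same key sent as a Bearer token
v2 = requests.get(
    "https://api.affinity.co/v2/companies",
    headers={"Authorization": f"Bearer {API_KEY}"},
)
```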
Initialize the pipeline with:

```sh
dlt init affinity duckdb
```
Here, we chose duckdb as the destination. Alternatively, you can choose redshift, bigquery, or any of the other supported destinations.
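For example, to scaffold the same source against BigQuery instead:

```sh
dlt init affinity bigquery
```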
- You'll need to obtain your API key and configure the pipeline with it.
- Install the necessary dependencies by running the following command:

  ```sh
  pip install -r requirements.txt
  ```

- Now the pipeline can be run using the command:

  ```sh
  python affinity_pipeline.py
  ```

- To make sure that everything is loaded as expected, use the command:

  ```sh
  dlt pipeline affinity_pipeline show
  ```
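Alternatively, you can inspect the loaded data programmatically through dlt's SQL client; a minimal sketch, assuming the pipeline has already been run against duckdb and the `companies` resource was loaded:

```python
import dlt

# Attach to the existing pipeline state by name
pipeline = dlt.pipeline(pipeline_name="affinity_pipeline", destination="duckdb")

# sql_client() opens a connection scoped to the pipeline's dataset
with pipeline.sql_client() as client:
    with client.execute_query("SELECT COUNT(*) FROM companies") as cursor:
        print(cursor.fetchall())
```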
💡 To explore additional customizations for this pipeline, we recommend referring to the official dlt Affinity documentation. It provides comprehensive information and guidance on how to further customize the pipeline to suit your specific needs. You can find the dlt Affinity documentation in the Setup Guide: Affinity.
This project uses devenv. Run:

```sh
generate-model
```
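If the script is not on your PATH, a typical workflow (assuming a standard devenv setup where project scripts are exposed inside the shell) is:

```sh
devenv shell    # enter the development environment defined by the project
generate-model  # then run the project's generator script
```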