Historical migrations overview

Prior to starting a historical data migration, ensure you do the following:

  1. Create a project on our US or EU Cloud.
  2. Sign up for a paid product analytics plan on the billing page (historical imports are free, but this unlocks the necessary features).
  3. Set the historical_migration option to true when capturing events in the migration. This is automated if you are running a managed migration.

Historical migrations refer to ingesting and importing past data into PostHog for analysis, such as events from another analytics tool (like Mixpanel or Amplitude) or from a self-hosted PostHog instance.

What about exporting data from PostHog? Use our batch export feature to export data from PostHog to external services like S3 or BigQuery.

The basics of migrating data into PostHog

Start your migration by formatting your data correctly. There is no way to selectively delete event data in PostHog, so getting this right is critical. This means:

  • Using the correct event names. For example, to capture a pageview event in PostHog, you capture a $pageview event. This might be different from the "name" other services use.

  • Including the timestamp field. This ensures your events are ingested with the correct time in PostHog. It needs to be in the ISO 8601 format and dated at least 48 hours before the time of import.

  • Using the correct distinct_id. This is the unique identifier for your user in PostHog. Every event needs one. For example, posthog-js automatically generates a uuidv7 value for anonymous users. A correctly formatted event is sketched after this list.
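
Putting these requirements together, a single event ready for import might look like the minimal sketch below. The distinct_id and property values are hypothetical examples; only the $pageview name, the distinct_id field, and the ISO 8601 timestamp come from the requirements above.

# Minimal sketch of one correctly formatted event (hypothetical values).
event = {
    "event": "$pageview",                    # PostHog's pageview event name
    "distinct_id": "user_123",               # hypothetical unique user identifier
    "timestamp": "2024-04-02T12:00:00Z",     # ISO 8601, at least 48 hours in the past
    "properties": {
        "$current_url": "https://example.com/pricing"  # hypothetical event property
    },
}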

To capture events, you must use the PostHog Python SDK or the PostHog API batch endpoint with the historical_migration option set to true. This ensures we handle this data correctly and you aren't charged standard ingestion fees for it.

Using our Python SDK, you can capture events like this:

from posthog import Posthog
from datetime import datetime

# historical_migration=True routes these events through the free historical import pipeline
posthog = Posthog(
    '<ph_project_api_key>',
    host='https://us.i.posthog.com',
    debug=True,
    historical_migration=True
)

events = [
    {
        "event": "batched_event_name",
        "properties": {
            "distinct_id": "user_id",
            "timestamp": datetime.fromisoformat("2024-04-02T12:00:00")
        }
    },
    {
        "event": "batched_event_name",
        "properties": {
            "distinct_id": "user_id",
            "timestamp": datetime.fromisoformat("2024-04-02T12:00:00")
        }
    }
]

for event in events:
    posthog.capture(
        distinct_id=event["properties"]["distinct_id"],
        event=event["event"],
        properties=event["properties"],
        timestamp=event["properties"]["timestamp"],
    )
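
For one-off import scripts, it can also help to flush the client before the process exits so queued events aren't lost. A minimal sketch, assuming the posthog-python client from the example above:

# Flush any queued events and stop the client's background threads before exiting.
posthog.shutdown()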

An example cURL request to the batch API endpoint looks like this:

curl -v -L --header "Content-Type: application/json" -d '{
    "api_key": "<ph_project_api_key>",
    "historical_migration": true,
    "batch": [
        {
            "event": "batched_event_name",
            "properties": {
                "distinct_id": "user_id"
            },
            "timestamp": "2024-04-03T12:00:00Z"
        },
        {
            "event": "batched_event_name",
            "properties": {
                "distinct_id": "user_id"
            },
            "timestamp": "2024-04-03T12:00:00Z"
        }
    ]
}' https://us.i.posthog.com/batch/
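
If your project is on our EU Cloud, send the request to https://eu.i.posthog.com/batch/ instead of the US host shown above.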

Best practices for migrations

  • We highly recommend testing at least a part of your migration on a test project before running it on your production project.

  • Keep exporting data from your old service separate from importing it into PostHog. Store the exported data in a storage service like S3 or GCS in between, so you can verify it is complete before importing.

  • Build resumability into your exports and imports, so you can just resume the process from the last successful point if any problems occur. For example, we use a cursor-based approach in our self-hosted migration tool.

  • To batch updates to users, use the same request format with the $identify event. The same applies to groups with the $group_identify event. A sketch of both follows this list.
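
A minimal sketch of batching person and group updates, assuming the same posthog client configured with historical_migration=True above; the property names under $set and $group_set are hypothetical examples:

# Update person properties for a user (hypothetical $set values).
posthog.capture(
    distinct_id="user_id",
    event="$identify",
    properties={
        "$set": {"plan": "scale", "signup_date": "2023-11-05"}
    },
    timestamp=datetime.fromisoformat("2024-04-02T12:00:00"),
)

# Update group properties (hypothetical group type, key, and $group_set values).
posthog.capture(
    distinct_id="user_id",
    event="$group_identify",
    properties={
        "$group_type": "company",
        "$group_key": "company_id_in_your_db",
        "$group_set": {"name": "Example Inc."}
    },
    timestamp=datetime.fromisoformat("2024-04-02T12:00:00"),
)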
