The Demyst Platform includes a few core components to allow your organisation to manage external data at scale. These guides will take you on a step-by-step journey through the configuration and deployment of Demyst's core components:
|Component|Summarised purpose|Starting guide|
|---|---|---|
|Connector|Connectors are the interface between the external data ecosystem and Demyst. External data and metadata are onboarded to create Connectors that are subsequently configured and exposed to clients as a Data API or Data Share. Connectors remove the complexity of managing different upstream data sources by delivering a single, unified experience for all external data sources. Through our connector guides, we will provide a walkthrough of how connectors function, how to integrate them into your Data APIs, and tips for troubleshooting.|Creating a Connector|
|Data API|Data APIs are the method for real-time transactional access to external data through Demyst. Demyst's configuration language provides a robust set of capabilities to tailor the API endpoint to your needs, such as the ability to redefine output field names and use a response from one connector as an input to another. Through our Data API guides, we will provide detailed documentation about Data API configuration, orchestration, and best practices.|Creating and deploying a Data API|
|Data Share|Data Shares are the method for asynchronous batch access to external data through Demyst. They allow for the ongoing ingestion of a data connector into landing zones such as data warehouses or lakes. Through our Data Share guides, we will provide detailed information about data share configuration, the security protocols in place, and how to effectively manage and monitor the status of your data shares.|Guide coming soon|
Connectors are integrations with an external data source, such as:
- An upstream API
- A dataset hosted by a data provider or Demyst
- Other sources requiring a consent-based workflow (such as cloud accounting platforms or open banking integrations)
Connectors include metadata to aid understanding and use, such as a description, coverage details, data provider information, and a data dictionary. Where a data provider offers multiple sources or endpoints, or where a solution is available in multiple countries, each of those is considered a different connector. Connectors can be deployed as either Data APIs or Data Shares.
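The metadata that accompanies a connector can be pictured as a small structured record. The sketch below is purely illustrative; the field names and connector name are assumptions for the sake of the example, not the platform's actual schema.

```python
# Illustrative shape of the metadata attached to a connector.
# All names here are hypothetical, not Demyst's documented schema.
connector = {
    "name": "example_business_registry_au",   # hypothetical connector name
    "description": "Company registration details for Australia",
    "coverage": ["AU"],                        # countries covered
    "data_provider": "Example Provider Pty Ltd",
    "data_dictionary": [                       # per-field documentation
        {"field": "business_name", "type": "string"},
        {"field": "registration_number", "type": "string"},
    ],
}
```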
Connectors remove the complexity of managing different upstream data sources by delivering a single, unified experience for all external data sources. Learn more about the features and benefits of connectors.
Demyst recommends that clients engage directly with data providers to license data products for production usage. In some instances, Demyst is able to act as a reseller for data providers on an interim basis. You can learn more about this by contacting us.
Data APIs are configured external data micro-services, each comprising one or more Connectors, a real-time access interface, a configuration logic layer (e.g. timeouts, waterfalls, field selection, and derived attributes), and an optional algorithm (e.g. a heuristic or predictive model). A Data API allows real-time synchronous access in production via Demyst's REST endpoint, and includes transaction error tracking and reporting, as well as configuration and release management. You configure and deploy a Data API using Demyst's proprietary language, which enables you to:
- Apply your own custom logic or models on top of multiple data sources
- Quickly switch data sources to iterate solutions and improve performance or cost
- Build redundancies to ensure your applications remain healthy
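Calling a deployed Data API is a standard JSON-over-HTTPS POST. The sketch below only illustrates the general shape of such a request; the endpoint URL, API-key placement, and input field names are all assumptions for illustration, not Demyst's documented interface — consult the Data API guides for the actual contract.

```python
# Hypothetical sketch of invoking a Data API over REST.
# The URL, credential handling, and payload keys are assumptions.
import json
import urllib.request

API_URL = "https://demyst.example.com/v2/execute"  # placeholder endpoint
API_KEY = "YOUR_API_KEY"                           # placeholder credential

def build_request(api_name: str, inputs: dict) -> urllib.request.Request:
    """Assemble a JSON POST request for a named Data API."""
    body = json.dumps({"api_key": API_KEY, "api": api_name, "inputs": inputs})
    return urllib.request.Request(
        API_URL,
        data=body.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request(
    "business_firmographics",                       # hypothetical Data API name
    {"business_name": "Acme Pty Ltd", "country": "AU"},
)
# To actually send: urllib.request.urlopen(req)
```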
Data Shares provide a secure and compliant means of delivering bulk data feeds to your organisation. Data Shares can range in complexity from delivering data in a standardised format from a single data provider, to utilising a Data API to enrich an existing dataset with information from one or more data providers. You may deploy data into a landing zone, such as a cloud system, in one of the following ways:
- Batch data delivery, via periodic copies into a shared destination (e.g. SFTP, S3, or equivalent), via push or pull
- Secure table sharing of ongoing managed copy of datasets, e.g. via Snowflake or equivalent
- Streaming, via delivery of incremental delta files, e.g. via Databricks, Kinesis, or equivalent
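For the streaming option above, the consuming side typically applies each incremental delta file as an upsert into its local copy of the dataset. The sketch below shows that idea in the abstract; the record shape and the `id` key are assumptions for illustration, not a prescribed Demyst format.

```python
# Illustrative sketch of applying an incremental delta file to a local
# copy of a Data Share. Record shape and key names are assumptions.
def apply_delta(base: dict, delta_records: list) -> dict:
    """Upsert delta records (keyed by 'id') into the base table copy."""
    merged = dict(base)
    for rec in delta_records:
        merged[rec["id"]] = rec  # new ids insert, existing ids overwrite
    return merged

base = {"b1": {"id": "b1", "revenue": 100}}
delta = [{"id": "b1", "revenue": 120},  # update to an existing record
         {"id": "b2", "revenue": 50}]   # newly delivered record
updated = apply_delta(base, delta)
```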
Data Shares allow you to:
- Rapidly access large files of new external data
- Avoid building and maintaining costly data pipelines
- Use standardised delivery mechanisms to match your information security policies
- Experience data type standardisation and entity resolution for higher match rates
A one-way Data Share is a Data Share that has no inputs provided by the client to Demyst. Data is delivered as-configured.
A two-way Data Share is a Data Share that is created from a set of inputs provided by the client to Demyst. Data is delivered based on the combination of the configuration and the matched set of data. The output is commonly referred to as an "enriched file", as the input data from the client is returned with additional data attributes appended.
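Conceptually, an enriched file is the client's input rows with matched attributes appended column-wise. The minimal sketch below illustrates that shape only; the column names are hypothetical examples, not the platform's actual output fields.

```python
# Illustrative sketch of a two-way Data Share "enriched file":
# client input rows come back with appended attributes.
# All column names here are hypothetical.
client_input = [{"business_name": "Acme Pty Ltd", "postcode": "2000"}]
appended = [{"match_confidence": 0.97, "employee_count": 42}]

def enrich(inputs: list, extras: list) -> list:
    """Append matched attributes onto each client input row."""
    return [{**row, **extra} for row, extra in zip(inputs, extras)]

enriched = enrich(client_input, appended)
```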