NetSuite SuiteAnalytics
DLH.io documentation for NetSuite SuiteAnalytics
The NetSuite SuiteAnalytics connector differs from our other NetSuite (Oracle NetSuite) connector: the NetSuite (Oracle NetSuite) connector uses the NetSuite REST API web services and requires you to deploy an integration application in your NetSuite instance yourself.
Because the two connection methods can expose different data, we recommend incorporating both connectors in your production instance, especially during a migration or integration effort, so that you can ascertain which data from your NetSuite instance you are able to obtain in your pipeline(s).
This NetSuite SuiteAnalytics connector is focused on using the NetSuite2.com connectivity feature of NetSuite, also known as the analytics data source, or SuiteAnalytics Connect. If your NetSuite instance currently supports only the older NetSuite.com option, we highly suggest working with your NetSuite success team to switch to the newer version, since the older 2018.2 version stopped receiving new development as of October 2021.
Pre-Requisites for NetSuite SuiteAnalytics Connector:
- A NetSuite SuiteAnalytics license
- NetSuite Administrator access to set up the integration
- Tokens and Keys from the integration
- Ability to assign the Data Warehouse Integrator (DWI) role
- Set your Service Data Source connection attribute to NetSuite2.com
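Once the prerequisites above are in place, the connection to the analytics data source is made over the SuiteAnalytics Connect driver. The sketch below assembles a DSN-less ODBC connection string for illustration; the host, account ID, and role ID values are placeholders, and the exact driver name and attribute spelling depend on the NetSuite Connect driver installed locally, so verify them against your own NetSuite setup page.

```python
# Illustrative sketch only: building an ODBC connection string for
# SuiteAnalytics Connect (NetSuite2.com). Attribute names follow common
# NetSuite Connect driver conventions; substitute values from your own
# account's SuiteAnalytics Connect setup page.

def build_connect_string(host: str, account_id: str, role_id: str) -> str:
    """Assemble a DSN-less ODBC connection string for NetSuite2.com."""
    attrs = {
        "DRIVER": "{NetSuite Drivers 64bit}",  # as named by your local driver install
        "Host": host,                          # service host shown in your account
        "Port": "1708",                        # default SuiteAnalytics Connect port
        "ServerDataSource": "NetSuite2.com",   # the analytics data source
        "Encrypted": "1",
        # Account ID and role ID (e.g. the Data Warehouse Integrator role)
        "CustomProperties": f"AccountID={account_id};RoleID={role_id}",
    }
    return ";".join(f"{k}={v}" for k, v in attrs.items())

# Placeholder values -- replace with your own account's details.
conn_str = build_connect_string("1234567.connect.api.netsuite.com", "1234567", "<DWI_ROLE_ID>")
print(conn_str)
```

A string like this would typically be passed to an ODBC client (for example `pyodbc.connect(conn_str)`) along with your token-based credentials.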
Setup Instructions
Follow the setup steps guide for configuring your NetSuite SuiteAnalytics connection to enable data to flow into your destination via DLH.
Supported Features
Here are key features supported for this connector.
| Sync Feature | Supported | Details |
|---|---|---|
| Custom Data and Development | ✓ | Ability to enhance connector upon request |
| Historical Re-Load/Load | ✓ | |
| Incremental/Delta Load | ✓ | Gets most recent records and changes |
| Column Selection | ✓ | |
| Column Hashing | ✓ | |
| Re-Sync Table/Entity | ✓ | Select at the table level to reload data history (on next Sync Bridge run) |
| Custom Queries | ✓ | Utilizing the SQL Data Query Connector |
| Captures Deleted Rows | ✓ | On all supported tables |
| API Sync Bridge Initiation | ✓ | |
| Priority Scheduling | ✓ | |
| Private VPC/Link | ☂ | |
| DLH Data Model Available | - | |
If you have any questions about these supported features please reach out to our Customer Support team.
Details on Sync Processing
For this connector, we believe the sync processing is straightforward. We've provided a number of details, steps, and other guidance here and in the setup steps guide. Be sure to also check the change log and notes page from time to time for any changes.
Data Replication Details
This section provides information on how data is replicated, at what frequency and which replication/synchronization details are worth noting from DLH.io.
| Replication Frequency Configurations | Details |
|---|---|
| Default Replication Frequency | 24 Hours |
| Minimum Frequency | 1 Hour (lower on Business Critical plan) |
| Maximum Frequency | 24 Hours |
| Custom Frequency | 1-24 Hours |
Replication Definitions
The following types of replication happen between this source and your target destination cloud data warehouse of choice when a Sync Bridge runs:
- Historical Data Load (also First Time Sync)
  - Occurs during the first/initial data sync of the Sync Bridge pipeline and any time you need to, or are requested to, conduct a historical re-sync of your data.
  - Here, DLH.io ingests all available data for the objects/tables you've selected, or that are available to you from the source based on your authentication access on that source.
  - This can take a relatively long time, since, as the name suggests, it retrieves all data from the source and replicates it to the target. If the source contains large amounts of data, even with our parallel processing a customer could expect runs of more than an hour for a large data system. If there are any concerns that a historical load or first-time sync has not completed within a reasonable amount of time, please contact DLH.io Support.
- Incremental (aka Delta) Data Load
  - After a first-time synchronization/replication or a historical data load, all subsequent replication runs for a Sync Bridge (source to target) are referred to as delta or incremental loads.
  - Here, DLH.io captures the latest/newest records and/or events, plus any changes/updates to existing records and/or events in the source connector, based on the frequency set in the Sync Bridge.
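The incremental pattern above can be sketched as a watermark query: select only rows modified since the last successful sync. The `transaction` table and `lastmodifieddate` column do exist in the NetSuite2.com data source, but the watermark bookkeeping and the exact SQL below are illustrative assumptions, not DLH.io's actual implementation.

```python
from datetime import datetime, timezone

def incremental_query(table: str, watermark: datetime) -> str:
    """Build an illustrative delta-load query: rows changed after the watermark.

    Assumes an Oracle-style TO_DATE function, which SuiteAnalytics Connect's
    SQL dialect generally supports; verify against your driver's documentation.
    """
    ts = watermark.strftime("%Y-%m-%d %H:%M:%S")
    return (
        f"SELECT * FROM {table} "
        f"WHERE lastmodifieddate > TO_DATE('{ts}', 'YYYY-MM-DD HH24:MI:SS') "
        f"ORDER BY lastmodifieddate"
    )

# Watermark from the previous successful sync (placeholder value).
last_sync = datetime(2024, 1, 1, tzinfo=timezone.utc)
print(incremental_query("transaction", last_sync))
```

After each run, the pipeline would record the maximum `lastmodifieddate` it saw and use that as the watermark for the next run; note that deletes are not visible to a query like this, which is why deleted-row capture is handled separately by the connector.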
Connector Considerations or Limitations
None at this time.