Accessing Historical Dynamics GP Data in NetSuite: The Azure Data Lake (eOne Solutions, 2023)

Through virtual integration with Popdock, you can leave old systems behind, turn off old SQL Servers, and still have quick access to historical data whenever you need it, without transferring that data into your new system.

In our previous article of this series, we showed you a couple of examples of how to access historical Dynamics GP data inside the NetSuite interface, along with how to embed it inside a customer dashboard.

In this article, we will demonstrate how to bring data from an on-premises system to an Azure Data Lake, and how to use the Popdock Data Lake Upload Tool.

How do you get data from the on-prem system to the Azure Data Lake?

One of the most important steps in the process of viewing old Dynamics GP data inside your NetSuite interface is to first move that historical data over to an Azure Data Lake. But the question is, how do you accomplish this?

Popdock’s Data Lake Upload Tool can help you achieve this task effortlessly. We’ll show you how to connect to your historical data and transfer it with ease.

With this Popdock feature, you can not only connect to all your SmartLists and favorite settings, but also move them to the Azure Data Lake. Once there, you can easily utilize them within NetSuite.

First, let’s start by learning how to connect to your historical data.

Connecting to the data: Connect to SmartLists and favorites to move to an Azure Data Lake

Here are some key points to consider:

  • This process doesn’t require you to expose your SQL Server to the internet or punch holes in your firewall, both of which are significant security concerns. In the past, we used a gateway to mitigate this; with the updated process, that is no longer needed. The tool runs right from your network and pushes the data out instead of having it pulled in.

  • This process enables you to transfer all historical Dynamics GP SmartLists, including the ones built with SmartList Builder and with SmartList Designer, Microsoft’s less robust counterpart. These lists arrive in the data lake with human-readable names, making them easy to use within NetSuite.

  • This process allows you to archive the entire database, including all SQL tables, not just GP data. This is essentially like an insurance policy that enables you to transfer all third-party products and custom tables to the data lake, making the information accessible in the cloud.

Moving the data: Move your SmartLists and favorites to an Azure Data Lake

After you have connected to the data, how do you move it?

It’s important to note that Popdock is designed to run in your environment, not remotely or from the cloud. You can run the tool in your on-premises environment and connect it to your systems without setting up another server. The gateway of the past is no longer necessary: you simply hook the tool into your system and run it from your workstation.

With Popdock, you can extract data from Microsoft Dynamics GP and move it to the Azure Data Lake quickly. The volume of data affects how long this takes (a terabyte will naturally take longer than 10 gigabytes), but the process itself is fast: the tool runs SQL statements and dumps the results into CSV files.

Once the extraction is complete, the uploading begins. The speed of this stage depends on your internet connection: the slower the connection, the longer the upload.
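To make that extract-and-upload shape concrete, here is a minimal Python sketch. It is not the Popdock tool itself; the server, database, container, and key are placeholders, and RM00101 is just one familiar GP table. It simply runs a SQL statement, dumps the result set to CSV, and pushes the file out to Azure Data Lake Storage Gen2:

```python
import csv
import io

import pyodbc  # pip install pyodbc
from azure.storage.filedatalake import DataLakeServiceClient  # pip install azure-storage-file-datalake

# Placeholder connection details -- substitute your own.
SQL_CONN = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=gp-server;DATABASE=TWO;Trusted_Connection=yes;"
)
ADLS_URL = "https://youraccount.dfs.core.windows.net"
ADLS_KEY = "<storage-account-key>"

def extract_to_csv(query: str) -> bytes:
    """Run a SQL statement against the on-premises GP database and
    return the result set as CSV bytes."""
    with pyodbc.connect(SQL_CONN) as conn:
        cursor = conn.cursor()
        cursor.execute(query)
        buf = io.StringIO()
        writer = csv.writer(buf)
        writer.writerow([col[0] for col in cursor.description])  # header row
        writer.writerows(cursor.fetchall())
        return buf.getvalue().encode("utf-8")

def upload_to_lake(data: bytes, path: str, container: str = "gp-archive") -> None:
    """Push the CSV out to Azure Data Lake Storage Gen2. The connection is
    outbound HTTPS from your network, so nothing inbound is opened."""
    service = DataLakeServiceClient(account_url=ADLS_URL, credential=ADLS_KEY)
    file_client = service.get_file_system_client(container).get_file_client(path)
    file_client.upload_data(data, overwrite=True)

# RM00101 is GP's customer master table.
upload_to_lake(extract_to_csv("SELECT * FROM RM00101"), "TWO/RM00101.csv")
```

Because the client makes an outbound HTTPS connection to the storage account, nothing inbound has to be opened on your firewall, which is the "push it out instead of pulling it" point made earlier.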

(Video) Accessing Historical Dynamics GP Data in NetSuite

Fixing errors in a list

The tool we’re about to show you lets you fix any errors that occur in a list, which is a real asset in this process.

As an example, let’s say your IT team ran a backup in the middle of the night; while it was running, the tool lost its connection to your server and half of the tables couldn’t be read. If that occurs, Popdock lets you pick up right where you left off. You can come back in the morning and, if not all of the 1,000 tables or lists you were uploading made it, resubmit just the ones that failed.

That’s a crucial piece of what Popdock allows you to do. It enables you to interact with and fix any problems that may occur during the uploading process.
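Conceptually, that resume-and-retry behavior is just bookkeeping: track a status per list and resubmit only the failures. A minimal sketch, with extract_list and upload_list as hypothetical stand-ins for the real extract and upload steps:

```python
from typing import Dict, List, Optional

def extract_list(name: str) -> Optional[List[tuple]]:
    ...  # run the SQL behind this SmartList and return its rows

def upload_list(name: str, rows: List[tuple]) -> None:
    ...  # push this list's CSV to the data lake

def run_batch(list_names: List[str], statuses: Dict[str, str]) -> None:
    for name in list_names:
        try:
            rows = extract_list(name)
            if not rows:
                statuses[name] = "skipped"    # list had no records
                continue
            upload_list(name, rows)
            statuses[name] = "succeeded"
        except Exception:
            statuses[name] = "failed"         # e.g. connection lost mid-backup

statuses: Dict[str, str] = {}
run_batch(["Customers", "Payables Transactions"], statuses)
# "Re-run selected failures": resubmit only what did not make it.
run_batch([n for n, s in statuses.items() if s == "failed"], statuses)
```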

How to use the Popdock Data Lake Upload Tool

Next, we’re going to show you how to use the Popdock Data Lake Upload Tool.

As we mentioned in the last article, this tool is part of Popdock and does not cost extra to use. You install it on your network, and it handles extracting the information.


When you log in, it will ask if you want to work with SmartLists or tables.

For today’s example, let’s begin with SmartLists.

After making your selection, click on the “Continue” button at the bottom right-hand side of the window.


Next, enter your Popdock login credentials and then click on “Log in.” This will connect you to your Popdock account where some of the setup information is located.


Next, let’s proceed with logging in to your Azure Data Lake account. This step is crucial because we will be archiving this information into your data lake.


After entering your data lake details, click on “Connect.” This will initiate the connection process with the connector.

Now, let’s move on to the most important part. Since we are working on-premises, we need to hook into the GP server itself so we can access the data directly.

(Video) Accessing Historical Dynamics SL Data in Netsuite


Popdock includes all the past versions of GP, so choose which version you need to connect to from the dropdown options.


After choosing your version, click on “Connect” once more.


Now you can see all the companies that you have.

In our example, the setup is straightforward because we only have three companies listed. However, some of our customers have anywhere from 70 to well over 100 companies, and many don’t want to move every one of them: some were test companies, some are no longer in use, and some were consolidated.


For example, in the case of our customer with 70 companies, they had no need to archive 20 of them.

If you don’t want to archive a company, simply uncheck the ones you don’t want.

In this demonstration, we’ll uncheck “Three’s Company” because we don’t have any data in that company.

After selecting the companies you wish to archive, click on “Next: Select lists.”


On the next page, you’ll see all the lists. These are your SmartLists: all the standard SmartLists plus the lists created with SmartList Builder.


As you scroll down, you’ll notice that these aren’t only the default lists; we have added SmartLists of our own. They include the eQA SmartLists, which were created inside the system for testing purposes. These were built with SmartList Builder, and there’s even one in here built with SmartList Designer.


Next, you’ll select the lists you want to move. After you have selected your lists, click on “Next: Run.”



If you only need to access specific information such as payables or receivables, you have the flexibility to do so.

(Video) Moving from Dynamics GP & Accessing Your Historical Data

You can see it’s running 22 jobs, even though we didn’t select that many lists. Why? Each selected list runs once for each company we selected, Fabrikam and Fore Iron Foundry, so the list count doubles across the two companies to produce 22. The tool works through each job to retrieve the data.
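In other words, the job count is lists multiplied by companies. A tiny illustrative sketch of that fan-out (the names are ours, not Popdock’s):

```python
companies = ["Fabrikam", "Fore Iron Foundry"]
selected_lists = ["Customers", "Vendors", "Payables Transactions"]  # ...and so on

# Each selected list is queued once per selected company.
jobs = [(company, smartlist) for company in companies for smartlist in selected_lists]
print(len(jobs))  # lists x companies; 11 lists x 2 companies would give 22
```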


For the Fabrikam company alone, there are over 300,000 records to go through, which means it will require some additional time to process.

Once this 300,000-row list has been processed, the remaining lists go quickly.

After the extraction is complete

Once all the rows have been extracted, the data goes through the uploading process.


Most of these lists uploaded successfully for the Fabrikam company. However, a few of them contained no records, so they were skipped during the process.


Further down, you can see that one list failed to upload. Hovering over it shows the reason: there appears to be a problem with a calculation in the list.


To fix the problem and try again, select the failed list and click on “Re-run selected failures.” Only that specific list will rerun.

If there is an issue with a list and you never fix it, it will continue to show up as a failure.


Choosing tables instead of SmartLists

The other option we want to show you is running tables instead of SmartLists.

On the welcome page of the Popdock Data Lake Upload Tool, select “Dynamics GP – Copy tables,” then click on “Continue” to log in to your Popdock account.

(Video) Migrating Away From Dynamics GP? - Keep Your Historical Data



Then type your data lake container name into the “Table container” field and click on “Connect.”


Select your version of GP and click on “Continue.”

On the next page, you will notice a difference. When you select SmartLists, you have individual lists to choose from; with tables, you won’t have the same level of customization and selection.

Instead, you specify which databases you want to archive. Whether a database holds 1,000 tables or 2,500 like the Fabrikam company’s, the tool archives every database you choose, pulling all the tables whether they’re Microsoft’s or not (a rough sketch of the idea follows).
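Under stated assumptions (a SQL Server source, plus the extract_to_csv and upload_to_lake helpers sketched earlier in this article), “archive every table” boils down to enumerating INFORMATION_SCHEMA.TABLES and exporting each one. This is an illustration of the idea, not Popdock’s implementation:

```python
import pyodbc

SQL_CONN = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=gp-server;DATABASE=TWO;Trusted_Connection=yes;"
)

with pyodbc.connect(SQL_CONN) as conn:
    cursor = conn.cursor()
    cursor.execute(
        "SELECT TABLE_SCHEMA, TABLE_NAME FROM INFORMATION_SCHEMA.TABLES "
        "WHERE TABLE_TYPE = 'BASE TABLE'"
    )
    tables = [f"{schema}.{name}" for schema, name in cursor.fetchall()]

# 1,000 tables or 2,500 -- every one gets archived, Microsoft's or not.
for table in tables:
    csv_bytes = extract_to_csv(f"SELECT * FROM {table}")   # helper sketched earlier
    upload_to_lake(csv_bytes, f"TWO/tables/{table}.csv")   # helper sketched earlier
```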

After you select “Next: Run,” this process will proceed. If you have a large amount of data, it may run for a few hours.

When it finishes, you have a record of everything you archived, backed up and available for future reference: an insurance policy for your peace of mind.

Next up: Helpful Popdock features

Now, you can securely connect to your Dynamics GP data without having to poke holes in your firewall. You can quickly move it to an Azure Data Lake and conveniently access and report on the data you need inside NetSuite.

With Popdock, you’re able to take on tasks that would normally be limited to developers. We’ve made it easier than ever for consultants and end users to connect, move, and report on data in whatever interface is needed. It’s simple to access, filter, and report on the data without leaving NetSuite (or any other system).

In our next how-to article, we’ll show you some extra Popdock features that will help you during your entire Dynamics GP data uploading process to NetSuite.

Are you ready to learn more about how Popdock can help your NetSuite experience? Contact one of our Popdock experts today at sales@eonesolutions.com or 888-319-3663 ext. 1. They are always eager to provide help and answers!

(Video) Go Beyond Viewing Data Inside of Dynamics GP

FAQs

How do you query data from data lake? ›

Querying data in your Amazon S3 data lake
  1. In the tree-view panel, choose the schema.
  2. To view a table definition, choose a table. The table columns and data types display.
  3. To query a table, choose the table and in the context menu (right-click), choose Select table to generate a query.
  4. Run the query in the Editor.
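Those steps describe a console workflow. As a programmatic alternative (our addition, not part of the steps above), the same kind of SELECT can be run over an S3 data lake with Amazon Athena through boto3; the database, table, and bucket names here are placeholders:

```python
import time

import boto3  # pip install boto3

athena = boto3.client("athena", region_name="us-east-1")
run = athena.start_query_execution(
    QueryString="SELECT * FROM my_table LIMIT 10",
    QueryExecutionContext={"Database": "my_datalake_db"},
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
)
qid = run["QueryExecutionId"]

# Poll until the query finishes.
while True:
    state = athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    for row in athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]:
        print(row["Data"])
```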

What is the difference between Azure Data Factory and Azure Data Lake? ›

ADF helps you transform, plan, and load data according to project requirements. Azure Data Lake, meanwhile, is highly secure and scalable data lake storage for analytics workloads. It can effectively store structured, semi-structured, and unstructured data.

What is Azure Data Lake storage used for? ›

Azure Data Lake includes all the capabilities required to make it easy for developers, data scientists, and analysts to store data of any size, shape, and speed, and do all types of processing and analytics across platforms and languages.

What is Azure Data Lake Analytics? ›

Azure Data Lake Analytics is an on-demand analytics job service that simplifies big data. Easily develop and run massively parallel data transformation and processing programs in U-SQL, R, Python, and .NET over petabytes of data.

How do I extract data from Azure Data Lake? ›

Configure an export
  1. Go to Data > Exports.
  2. Select Add export.
  3. In the Connection for export field, choose a connection from the Azure Data Lake section. ...
  4. Enter a name for the export.
  5. Enter the folder name for the Azure Data Lake Storage Gen2 storage.

Can we query data in Azure Data Lake? ›

You can analyze and query data without prior ingestion into Azure Data Explorer. You can also query across ingested and uningested external data simultaneously. For more information, see how to create an external table using the Azure Data Explorer web UI wizard.
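As a small illustration, here is a hedged Python sketch of querying an external (uningested) table through the Azure Data Explorer SDK; the cluster, database, and table names are placeholders:

```python
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder  # pip install azure-kusto-data

cluster = "https://mycluster.westus.kusto.windows.net"
kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(cluster)
client = KustoClient(kcsb)

# external_table() reads the data where it sits -- no prior ingestion.
response = client.execute("MyDatabase", 'external_table("GPArchive") | take 10')
for row in response.primary_results[0]:
    print(row)
```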

What type of data can be stored in Azure Data Lake? ›

Azure Data Lake is one of the leading cloud platforms supporting big data analytics. It provides unlimited storage for structured, semi-structured, or unstructured data and can store any type of data of any size.

What data format does Azure Data Lake use? ›

A data lake is a storage repository that holds a large amount of data in its native, raw format. Data lake stores are optimized for scaling to terabytes and petabytes of data. The data typically comes from multiple heterogeneous sources, and may be structured, semi-structured, or unstructured.

What is the difference between database and data lake? ›

What is the difference between a database and a data lake? A database stores the current data required to power an application. A data lake stores current and historical data for one or more systems in its raw form for the purpose of analyzing the data.

What's the difference between data warehouse and data lake? ›

While data warehouses store structured data, a lake is a centralized repository that allows you to store any data at any scale. A data lake offers more storage options, has more complexity, and has different use cases compared to a data warehouse.

What are security components in Azure Data Lake? ›

Data Lake Storage provides six different layers of security: authentication, access control, network isolation, data protection, advanced threat protection, and auditing. ADLS supports three different authentication methods. Azure Active Directory is the ideal way to verify a user's identity.

How do I connect to Azure Data Lake? ›

Connect to Azure Data Lake Storage
  1. Go to Data > Data sources.
  2. Select Add a data source.
  3. Select Azure Data Lake Storage Gen 2.
  4. Enter a Data source name and an optional Description. ...
  5. Choose one of the following options for Connect your storage using.

Is Azure Data Lake a SQL database? ›

Azure Data Lake itself is storage rather than a SQL database, but Azure Data Lake Analytics supports creating two types of U-SQL tables over it. One is managed tables: the regular tables created using Azure Data Lake Analytics as the data source.

How do I monitor Azure Data Lake? ›

Sign on to the Azure portal. Open your Data Lake Analytics account and select Diagnostic settings from the Monitoring section. Next, select + Add diagnostic setting.
...
Enable logging
  1. You can choose to store/process the data in four different ways. ...
  2. Specify whether you want to get audit logs or request logs or both.

What language is used in Azure Data Lake? ›

Azure Data Lake Analytics uses U-SQL as its query and processing language. U-SQL combines SQL with the C# programming language.

What is the file structure for Azure Data Lake storage? ›

Hierarchical directory structure

The hierarchical namespace is a key feature that enables Azure Data Lake Storage Gen2 to provide high-performance data access at object storage scale and price.

Which 2 of the following are benefits of running queries against the data lake? ›

Benefits of Data Lake queries

They always give results for all endpoints, whether they're connected or not. They can query data that's up to 30 days old. You can configure the time period so that they only generate as much data as you need. They can be scheduled.

Which tool can be used for data ingestion in Azure Data Lake? ›

Azure Data Explorer provides SDKs that can be used for query and data ingestion.

What is the file size limit for Azure Data Lake? ›

5 TB file size limit.

What are the advantages of Azure Data Lake? ›

Why Should I Use Azure Data Lake?
  • Cost Optimization. ...
  • Provides Benefit of Cloud Scale and Performance. ...
  • Seamless Integration with Other Azure Services. ...
  • Simplifies Big Data Analytics. ...
  • Built-in Data Cloud Governance.

What is the best format for data lake? ›

STORAGE FORMATS: A PRACTICAL PERSPECTIVE

Text Files – Information will often come into the data lake in the form of delimited text, JSON, or other similar formats. As discussed above, text formats are seldom the best choice for analysis, so you should generally convert to a compressed format like ORC or Parquet.
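As a quick illustration of that advice, here is a hedged sketch that converts a delimited text file into compressed Parquet with pyarrow (file names are placeholders):

```python
import pyarrow.csv as pv       # pip install pyarrow
import pyarrow.parquet as pq

table = pv.read_csv("TWO/RM00101.csv")        # delimited text as it landed in the lake
pq.write_table(table, "TWO/RM00101.parquet",  # compressed, columnar copy for analysis
               compression="snappy")
```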

Is Azure Data Lake PaaS or SaaS? ›

Azure Data Lake is a cloud-based PaaS solution for huge data storage. It supports trillions of files, with individual files up to a petabyte in size, and delivers hundreds of gigabits of throughput. It is built on Azure Blob Storage technology with a hierarchical namespace and can be accessed through the Hadoop Distributed File System (HDFS) interface.

What file formats are open in data lake? ›

Qubole supports all the major open-source formats like JSON, XML, Parquet, ORC, Avro, CSV, etc. Supporting a wide variety of file formats adds flexibility to tackle a variety of use cases.

Can a data lake replace a database? ›

Data lakes are very popular in the modern stack because of their flexibility and low cost, but they are not a replacement for data warehouses or relational databases.

Is a data lake just a file system? ›

A data lake is a system or repository of data stored in its natural/raw format, usually object blobs or files.

Is data lake only for big data? ›

A data lake is a centralized repository designed to store, process, and secure large amounts of structured, semistructured, and unstructured data. It can store data in its native format and process any variety of it, ignoring size limits.

Do you need a data warehouse if you have a data lake? ›

A data lake is not a direct replacement for a data warehouse; they are supplemental technologies that serve different use cases with some overlap. Most organizations that have a data lake will also have a data warehouse.

How structured data is stored in data lake? ›

A data lake is a central location that holds a large amount of data in its native, raw format. Compared to a hierarchical data warehouse, which stores data in files or folders, a data lake uses a flat architecture and object storage to store the data.

Why is data lake cheaper than data warehouse? ›

Storage is fairly inexpensive in a data lake compared to a data warehouse. Data lakes are also less time-consuming to manage, which reduces operational costs. Data warehouses cost more than data lakes and require more time to manage, resulting in additional operational costs.

What is the difference between Azure Data Lake and blob storage? ›

Azure Blob Storage is a general-purpose, scalable object store designed for a wide variety of storage scenarios, with authentication based on shared secrets: account access keys and shared access signature (SAS) keys. Azure Data Lake Storage Gen1 is a hyper-scale repository optimized for big data analytics workloads.

What is a key component of a data lake? ›

Data storage is one of the key components of a Data Lake architecture. A well-architected storage layer should: Be highly scalable and available. Provide low-cost storage.

How do I give access to data lake? ›

  1. Intended audience.
  2. Prerequisites.
  3. Step 1: Create a data analyst user.
  4. Step 2: Add permissions to read AWS CloudTrail logs to the workflow role.
  5. Step 3: Create an Amazon S3 bucket for the data lake.
  6. Step 4: Register an Amazon S3 path.
  7. Step 5: Grant data location permissions.
  8. Step 6: Create a database in the Data Catalog.

Can we create tables in Azure Data Lake? ›

The commands in this article can be used to create or alter an Azure Storage external table in the database from which the command is executed. An Azure Storage external table references data located in Azure Blob Storage, Azure Data Lake Store Gen1, or Azure Data Lake Store Gen2.

What is the difference between SQL database and lake database in Azure? ›

What are the differences between Lake Databases and SQL Serverless databases? At face value, they're very similar. They're both means by which you can query data in your data lake. However, Lake Databases are special in that they're synchronized between the Spark and the SQL Serverless engines in Synapse.

What is the difference between SQL pool and data lake? ›

Serverless SQL pool acts as compute engine while the data lake serves as storage. It uses pay per use model and is always available without any additional cost for reservation (In the case of a Dedicated SQL pool, only pausing helps to cut the cost when it is not used).

What are the zones in a data lake? ›

No two data lakes are built exactly alike. However, there are some key zones through which the general data flows: the ingestion zone, landing zone, processing zone, refined data zone and consumption zone.

How do I access data from AWS data lake? ›

Walkthrough
  1. Create a data lake administrator.
  2. Register an Amazon S3 path.
  3. Create a database.
  4. Grant permissions.
  5. Crawl the data with AWS Glue to create the metadata and table.
  6. Grant access to the table data.
  7. Query the data using Amazon Athena.
  8. Add a new user with restricted access and verify the results.

How do I connect data lake to excel? ›

In Excel, open the Data tab and choose From Other Sources -> From Microsoft Query. Choose the ADLS DSN. Select the option to use Query Wizard to create/edit queries. In the Query Wizard, expand the node for the table you would like to import into your spreadsheet.

How to retrieve data using query? ›

You can use an asterisk character, *, to retrieve all the columns. In queries where all the data is found in one table, the FROM clause is where we specify the name of the table from which to retrieve rows. In other articles we will use it to retrieve rows from multiple tables.

How do I access data in Azure Data Lake? ›

  1. Step 1: Create an Azure service principal. ...
  2. Step 2: Create a client secret for your service principal. ...
  3. Step 3: Grant the service principal access to Azure Data Lake Storage Gen2. ...
  4. Step 4: Add the client secret to Azure Key Vault. ...
  5. Step 5: Create Azure Key Vault-backed secret scope in your Azure Databricks workspace.

How to read data from Azure Data Lake using Python? ›

  1. You can use the Azure identity client library for Python to authenticate your application with Azure AD. ...
  2. To use a shared access signature (SAS) token, provide the token as a string and initialize a DataLakeServiceClient object. ...
  3. You can authorize access to data using your account access keys (Shared Key).
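A minimal sketch of the first option, pairing azure-identity with the Data Lake client library; the account, container, and file path are placeholders:

```python
from azure.identity import DefaultAzureCredential       # pip install azure-identity
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://youraccount.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)
file_client = (
    service.get_file_system_client("gp-archive")
    .get_file_client("TWO/RM00101.csv")
)
data = file_client.download_file().readall()  # bytes of the archived CSV
print(data[:200])
```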

What is AWS equivalent of Azure Data Lake? ›

Azure Data Lake is the competitor to AWS Lake Formation. As with AWS, Azure Data Lake is centered around its storage capacity, with Azure Blob Storage being the equivalent of Amazon S3 storage.

How do I connect dynamics to Excel? ›

Connecting Excel to Dynamics 365 with Microsoft Query
  1. Start Excel, click the Data tab.
  2. In the appeared ribbon, click From Other Sources, and then click From Microsoft Query.
  3. In the next dialog, choose the data source you want to connect to (e.g., using data source name - Devart ODBC Dynamics 365).

How do I convert CSV to Excel in Azure data Factory? ›

Using Excel packager API with Azure Data Factory
  1. Step 1: Create one csv file in blob storage for each tab with path of /EXCELFILENAME/SheetName.csv.
  2. Step 2: Call the API with the above details using Web or Webhook blocks.
  3. Step 3: Read and/or move the excel file to intended destination.
  4. Step 4: Cleanup files.

How do I use Azure Data Lake? ›

Create a Data Lake Analytics account
  1. Sign on to the Azure portal.
  2. Select Create a resource, and in the search at the top of the page enter Data Lake Analytics.
  3. Select values for the following items: ...
  4. Optionally, select a pricing tier for your Data Lake Analytics account.
  5. Select Create.

Which query is used to retrieve all records from the database? ›

The SQL SELECT Statement

The SELECT statement is used to select data from a database. The data returned is stored in a result table, called the result-set.

Which SQL command is used to retrieve data? ›

In a database, the SELECT query is used to retrieve data from a table. It is the most commonly used SQL query.

Which option is used to retrieve data? ›

DML is an abbreviation of Data Manipulation Language. It is used to retrieve, store, modify, delete, insert, and update data in a database.

Where is data stored in data lake? ›

A data lake is a centralized repository designed to store, process, and secure large amounts of structured, semistructured, and unstructured data. It can store data in its native format and process any variety of it, ignoring size limits.

What data types are in a data lake? ›

A data lake can include structured data from relational databases (rows and columns), semi-structured data (CSV, logs, XML, JSON), unstructured data (emails, documents, PDFs) and binary data (images, audio, video).

Can a data lake be a database? ›

You might be wondering, "Is a data lake a database?" A data lake is a repository for data stored in a variety of ways including databases. With modern tools and technologies, a data lake can also form the storage layer of a database.
