Informatica MDM | What Is Data Flow, Delta Detection, and Audit Trail Setup?
Posted by Support@InventModel.com on Apr 25, 2025 in Informatica MDM Interview Questions and Answers
Hello, everyone. In this session we will focus on the Informatica MDM data flow. We will look at the different assets available within the flow, and our focus will be on how data actually moves through Informatica MDM.
Landing Table
The first thing is the landing table. This is the starting point of Informatica MDM.
Here we collect data from multiple sources, and we then separate that data using staging tables. We create a source system definition for each source, segregate our data by source system, and store the golden record in a base object table. If you want to create a staging table, you first need to create a base object table, because a staging table is a subset of a base object table.
So we can say that the landing table is the starting point, where we receive and store the data coming from the different sources.
Staging Table
The staging table is a subset of the base object table. Here the data goes through cleansing, validation, and standardization, and duplicate data is removed. In the base object table we keep only the final master data.
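To make this flow concrete, here is a minimal Python sketch (not Informatica code) of the three layers. The record layout, the source_system field, and the simple first-record-wins consolidation are all illustrative assumptions; in the real Hub, consolidation into the base object is driven by match and merge rules.

```python
# Minimal sketch of the Informatica MDM data flow using plain Python
# structures instead of database tables. All records and field names
# (source_system, cust_name, etc.) are illustrative assumptions.

# Landing table: raw data collected as-is from multiple source systems
landing = [
    {"source_system": "CRM", "cust_name": " john SMITH ", "city": "New York"},
    {"source_system": "ERP", "cust_name": "John Smith",   "city": "New York"},
]

# Staging tables: one per source system, holding cleansed/standardized data
staging = {}
for rec in landing:
    cleaned = {
        "cust_name": rec["cust_name"].strip().title(),
        "city": rec["city"].strip().title(),
    }
    staging.setdefault(rec["source_system"], []).append(cleaned)

# Base object table: the consolidated golden record (here we simply keep
# the first surviving version per name; real MDM uses match/merge rules)
base_object = {}
for source, records in staging.items():
    for rec in records:
        base_object.setdefault(rec["cust_name"], rec)

print(staging)      # cleansed data, segregated by source system
print(base_object)  # one golden record for "John Smith"
```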
Mapping
Another important thing is mapping. When data moves from the landing table to the stage table, we have to map that data so it can be filtered or cleaned by some function, and that is how we get clean data. Through this flow, data moves from the landing table to the staging table.
Informatica MDM Mapping
The landing table stores raw data from multiple sources without changes, but the staging table stores cleansed and processed data that is ready for loading. So in the staging table we only keep our clean, processed data.
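As a rough illustration of what a mapping does, the Python sketch below maps landing columns to staging columns and applies a cleanse step while the data moves. The column names and the trim_and_upper cleanse logic are assumptions for illustration, not the actual Hub mapping configuration.

```python
# Illustrative mapping from landing columns to staging columns.
# Each staging column is defined by the landing column it comes from
# plus an optional cleanse function applied during the move.

def trim_and_upper(value: str) -> str:
    """Example cleanse step: remove stray spaces and standardize case."""
    return value.strip().upper()

# column mapping: staging_column -> (landing_column, cleanse_function or None)
mapping = {
    "CUSTOMER_NAME": ("cust_name", trim_and_upper),
    "CITY":          ("city", trim_and_upper),
    "SOURCE":        ("source_system", None),
}

def run_mapping(landing_record: dict) -> dict:
    """Move one landing record into staging shape, cleansing on the way."""
    staged = {}
    for stage_col, (land_col, cleanse) in mapping.items():
        value = landing_record[land_col]
        staged[stage_col] = cleanse(value) if cleanse else value
    return staged

print(run_mapping({"cust_name": " john smith ", "city": "ny", "source_system": "CRM"}))
```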
The next important topic is delta detection and the audit trail setup.
Delta Detection and Audit Trail Setup
Delta detection means identifying and processing only the changed data instead of running a full load. We may have large datasets, but we only want the updated records, and that is where the delta detection setup comes in. It appears when we declare the staging table: in its settings we can apply the Enable Delta Detection option. There are three choices for how deltas are detected. The first is detecting deltas by comparing all columns in the mapping. The second is detecting deltas using a date column, so if the date changes, the record is treated as updated and gets loaded. The third is detecting deltas using a specific column, so whenever that column changes, the record is considered changed and is picked up for loading.
Yes, it stores the table data for comparison between the landing table and the staging table. It compares the previous load with the current load and loads only the updated records into the stage table.
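To show the idea behind the three delta detection choices, here is a hedged Python sketch that compares a previous load with a current load by all columns, by a date column, or by one specific column. The CUSTOMER_ID key, column names, and sample data are assumptions for illustration only.

```python
# Illustration of delta detection: keep only records that changed since
# the previous load. Records are keyed by a hypothetical CUSTOMER_ID.

previous_load = {
    "C1": {"NAME": "JOHN SMITH", "CITY": "NEW YORK", "LAST_UPDATE": "2025-04-20"},
    "C2": {"NAME": "JANE DOE",   "CITY": "BOSTON",   "LAST_UPDATE": "2025-04-20"},
}
current_load = {
    "C1": {"NAME": "JOHN SMITH", "CITY": "NEW YORK", "LAST_UPDATE": "2025-04-20"},  # unchanged
    "C2": {"NAME": "JANE DOE",   "CITY": "CHICAGO",  "LAST_UPDATE": "2025-04-24"},  # changed
}

def delta_all_columns(prev, curr):
    """Option 1: a record is a delta if any mapped column differs."""
    return {k: v for k, v in curr.items() if prev.get(k) != v}

def delta_date_column(prev, curr, date_col="LAST_UPDATE"):
    """Option 2: a record is a delta if its date column changed."""
    return {k: v for k, v in curr.items()
            if prev.get(k, {}).get(date_col) != v[date_col]}

def delta_specific_column(prev, curr, col="CITY"):
    """Option 3: a record is a delta if one chosen column changed."""
    return {k: v for k, v in curr.items()
            if prev.get(k, {}).get(col) != v[col]}

print(delta_all_columns(previous_load, current_load))      # only C2
print(delta_date_column(previous_load, current_load))      # only C2
print(delta_specific_column(previous_load, current_load))  # only C2
```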
Another important feature is the audit trail. With the audit trail we maintain a record of data changes, ensuring compliance and traceability. When we want to trace our data and keep historical data, we use the audit trail to help maintain history.
It helps with maintaining history, and there is an option to preserve the audit trail in the raw table, where we set how many past loads of historical data we want to keep; only that many loads are retained. For example, if we configure it to keep two loads, it will store the two most recent historical loads in detail. When the audit trail is enabled, a raw table is populated, and it stores the historical data.
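Conceptually, enabling the audit trail means each load is also copied into a raw/history structure, and only a configured number of past loads is preserved. The sketch below illustrates that retention idea in Python; the two-load setting and record layout are assumptions, not the Hub's actual storage.

```python
from datetime import datetime

# Sketch of audit-trail retention: every load is appended to a raw history
# list, and only the most recent N loads are preserved (N is configurable;
# 2 here simply mirrors "keep the two most recent loads").

PRESERVE_LOADS = 2
raw_history = []  # each entry: {"load_time": ..., "records": [...]}

def run_load_with_audit(records):
    """Append the incoming load to the raw history, then prune old loads."""
    raw_history.append({"load_time": datetime.now().isoformat(),
                        "records": list(records)})
    # keep only the most recent PRESERVE_LOADS loads
    del raw_history[:-PRESERVE_LOADS]

run_load_with_audit([{"NAME": "JOHN SMITH"}])
run_load_with_audit([{"NAME": "JOHN SMITH", "CITY": "NEW YORK"}])
run_load_with_audit([{"NAME": "JANE DOE"}])
print(len(raw_history))  # 2 -> only the last two loads are kept
```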
Another very important thing is the cleanse function. Cleanse functions in Informatica MDM help in cleaning and standardizing our data before it is loaded into the stage table.
Cleanse Function
There are multiple cleanse functions. Most of the time we use the string functions to clean our data, because when data comes from different sources it sometimes contains numeric characters where it should not.
Informatica MDM Cleanse Functions
Here we have multiple cleanse function libraries to choose from, including regular expression functions and string functions. For example, when a name comes in with a numeric value like 1, 2, or 3 attached to it, we can use a string function from the cleanse library to handle that.
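As a rough analogue of the string cleansing idea described above, the snippet below strips numeric characters out of a name and standardizes spacing and case in Python. It mirrors the concept only; it is not a function from the Informatica cleanse libraries.

```python
import re

def cleanse_name(raw_name: str) -> str:
    """Example string cleanse: drop digits, collapse spaces, fix case."""
    no_digits = re.sub(r"\d+", "", raw_name)      # remove 1, 2, 3, ...
    collapsed = re.sub(r"\s+", " ", no_digits)    # collapse repeated spaces
    return collapsed.strip().title()              # standardize case

print(cleanse_name("  john   SMITH 123 "))  # -> "John Smith"
```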
Load Process
The load process is the step where we transfer data from the staging table to the base object table. It ensures that only clean, validated master data is stored in the base object tables.
We are simply moving our data from the stage table to the base object table, and, as I already said, the final master data is stored in that table. The data arrives there cleansed, validated, and standardized. So, as shown above, the data flows from the landing table to the staging table and then to the base object table.
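To close the loop, here is a small Python sketch of the load step: cleansed staging records are pushed into the base object keyed by a hypothetical CUSTOMER_ID, so the base object ends up holding one consolidated row per entity. The "latest values win" merge is a simplification; the real load process applies trust and match/merge rules.

```python
# Sketch of the load process: move cleansed staging records into the
# base object table. CUSTOMER_ID and the "latest values win" rule are
# simplifying assumptions for illustration.

base_object = {}  # CUSTOMER_ID -> golden record

def load_to_base_object(staging_records):
    for rec in staging_records:
        key = rec["CUSTOMER_ID"]
        # merge new values over whatever is already in the base object
        golden = base_object.get(key, {})
        golden.update(rec)
        base_object[key] = golden

load_to_base_object([{"CUSTOMER_ID": "C1", "NAME": "JOHN SMITH", "CITY": "NEW YORK"}])
load_to_base_object([{"CUSTOMER_ID": "C1", "PHONE": "555-0100"}])
print(base_object["C1"])  # consolidated golden record for C1
```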
✅ Conclusion
Informatica MDM provides a powerful, structured data management flow that ensures data is accurate, clean, and reliable across systems. It starts from the Landing Table, which captures raw data from different source systems. This data is then processed through Staging Tables, where cleansing, validation, and standardization take place. From there, it moves into the Base Object Table, which carries the trusted, golden records.
Using cleansing functions, mappings, and structured load processes, Informatica MDM ensures that data entering the system is consistent, high-quality, and master-ready. This flow not only enhances data integrity but also plays an important role in driving confident decision-making for businesses.
If you want to accelerate your career in the field of data, this is the right time to start. Join our Informatica MDM On-Premises Training.