If you come across any of the following issues while importing, this page will help you resolve them. For solutions to the most common import issues, refer to the [Imports FAQ](🔗).
Avoid running daily imports of all events, that is, re-importing the same events each day along with new ones. Doing so can cause issues with automation flows, and the re-imported events still count toward your MPE. Learn more about how to optimize your Engagement usage [here](🔗).
While importing, make sure you select the right columns. If you miss a column, or include an extra column that is not required, you will have to go back into the import process to fix it.
Assign the corresponding attribute tag to the data you import. After adding the column and its data type, select the attribute checkbox to import the column as an attribute.
While importing customer properties or events, link them to the Customer ID so the data is matched to the right customer profile. However, do not select the checkbox for the Customer ID column in the import table. Doing so imports the ID as an attribute, which can cause several problems in your project, such as duplicated information. With large datasets, duplication can also inflate your data volume.
Assign the correct data type to each column while importing data. For example, importing a timestamp as an integer might cause issues in the final display; timestamps must be imported using the Date type.
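As a quick illustration of preparing data with the right types before an import, the sketch below parses a timestamp column into a real date value instead of leaving it as a plain string. The column names and values are hypothetical, not part of any required schema.

```python
import csv
from datetime import datetime
from io import StringIO

# Hypothetical CSV snippet; column names are illustrative only
csv_data = StringIO(
    "customer_id,purchase_time\n"
    "c-001,2024-03-01 10:15:00\n"
    "c-002,2024-03-02 18:40:00\n"
)

rows = []
for row in csv.DictReader(csv_data):
    # Convert the raw string into a real datetime before import,
    # rather than importing it as a string or integer
    row["purchase_time"] = datetime.strptime(
        row["purchase_time"], "%Y-%m-%d %H:%M:%S"
    )
    rows.append(row)

print(rows[0]["purchase_time"].year)  # 2024
```

Doing this conversion up front surfaces malformed values as parse errors before the import, instead of as display issues afterwards.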
While [importing catalogs](🔗), mark the right columns as searchable. If the columns are not searchable, you might face difficulty creating recommendations because the data will not be indexed in the backend. Note that searchable columns can be assigned only while creating the import. Once the catalog structure is created, you cannot change it; you can only add more items to the catalog.
If a repeated import gets stuck, edit the repeated import settings and save them again without changing anything. This will kick-start the repeated imports.
Events with future timestamps are processed, but become visible only after that future time has passed. Re-importing or modifying such an event in the meantime can create duplicates and put unnecessary load on the platform, because the original event is not yet visible.
To work with the data and make it visible today, import it again with a valid timestamp. Once the corrected import runs successfully, the data should appear in the platform without problems.
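Before re-importing, it can help to separate rows with valid timestamps from rows dated in the future. A minimal sketch, assuming events carry a Unix `timestamp` field (the field name and payloads are hypothetical):

```python
import time

# Illustrative event payloads with Unix timestamps (names are hypothetical)
events = [
    {"event": "purchase", "timestamp": 1700000000},               # in the past
    {"event": "purchase", "timestamp": int(time.time()) + 86400}, # 1 day ahead
]

now = time.time()
# Events up to "now" are safe to import and will be visible immediately;
# future-dated events should be corrected before re-importing
valid = [e for e in events if e["timestamp"] <= now]
future = [e for e in events if e["timestamp"] > now]

print(len(valid), len(future))  # 1 1
```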
Imports are processed on a first-come, first-served basis. Sometimes there is a delay between the import and its visibility in the UI. This can happen when larger imports are queued in the system and take up more system resources.
We recommend importing timestamps in the Unix timestamp format, as it is flexible and unambiguous.
Make sure there are no misspelled ID values in your import. Misspelled IDs create duplicate customer profiles in the system.
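A common source of "misspelled" IDs is stray whitespace or inconsistent casing. The sketch below flags ID values that differ only in those ways; the sample IDs are hypothetical:

```python
# Simple pre-import check for ID values that differ only by case or
# surrounding whitespace, which would create duplicate profiles
ids = ["cust-001", "Cust-001 ", "cust-002"]

seen = {}
suspects = []
for raw in ids:
    key = raw.strip().lower()
    if key in seen:
        # Record both spellings so they can be reconciled before import
        suspects.append((seen[key], raw))
    else:
        seen[key] = raw

print(suspects)  # [('cust-001', 'Cust-001 ')]
```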
Most import source issues occur when the files used for imports contain errors or are missing elements, which causes the import to fail. Do a basic check to make sure everything is in place before importing.
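Such a basic check can be automated. Here is a minimal sketch that verifies a CSV has the expected columns and no empty required values; the required column names and sample data are assumptions for illustration:

```python
import csv
from io import StringIO

# Minimal pre-import sanity check, assuming a hypothetical required schema
REQUIRED_COLUMNS = {"customer_id", "purchase_time"}

csv_data = StringIO(
    "customer_id,purchase_time\n"
    "c-001,2024-03-01 10:15:00\n"
    "c-002,\n"  # missing value in the second data row
)

reader = csv.DictReader(csv_data)
# Any required column missing from the header?
missing_cols = REQUIRED_COLUMNS - set(reader.fieldnames or [])
# Any data row with an empty required value? (header is line 1)
bad_rows = [
    i for i, row in enumerate(reader, start=2)
    if any(not (row[c] or "").strip() for c in REQUIRED_COLUMNS)
]

print(missing_cols, bad_rows)  # set() [3]
```

Running a check like this before uploading catches most file-level problems while they are still cheap to fix.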
If your import exceeds the known import limits, contact the Product team to find out whether it can be accommodated.
When there is an error, only the rows containing errors are skipped; the rest of the file is imported.