SoftTree Technologies
Technical Support Forums
gemisigo
Joined: 11 Mar 2010 Posts: 2141
[12.4.476 Pro] - Data Transfer issue
When trying to transfer data from one server to another, I've run into the following error:
2023-12-20 02:45:31 Querying source table TD_ONE_DATA.dbo.jarmu_kieg. Please wait...
2023-12-20 02:45:31 Table [TX_ONE_DATA].[dbo].[jarmu_kieg] truncated.
2023-12-20 02:45:31 9688 records fetched from table TD_ONE_DATA.dbo.jarmu_kieg
2023-12-20 02:45:31 Error bulk inserting records. ExecuteNonQuery: .NET Exception: The given value of type String from the data source cannot be converted to type decimal of the specified target column;
INSERT INTO [TX_ONE_DATA].[dbo].[jarmu_kieg] ([jarmu_id],[ferohely],[ulohely],[statferohely],[statusz],[fogy_norma],[Datum_tol],[Datum_ig],[KK_id],[change_timestamp],[etl_timestamp]) VALUES (@b0,@b1,@b2,@b3,@b4,@b5,@b6,@b7,@b8,@b9,@b10)
2023-12-20 02:45:31 Inserted 0 out of 9688 rows ( 0% ).
2023-12-20 02:45:31 No records inserted. Processing of table stopped on error.
The structure of the table is the following:
CREATE TABLE [dbo].[jarmu_kieg](
[jarmu_id] [int] NOT NULL,
[ferohely] [int] NULL,
[ulohely] [int] NULL,
[statferohely] [int] NULL,
[statusz] [varchar](5) NULL,
[fogy_norma] [decimal](5, 3) NULL,
[Datum_tol] [smalldatetime] NOT NULL,
[Datum_ig] [smalldatetime] NULL,
[KK_id] [int] NOT NULL,
[change_timestamp] [datetime] NOT NULL,
[etl_timestamp] [datetime] NULL,
CONSTRAINT [PK_Jarmu_kieg] PRIMARY KEY CLUSTERED
(
[jarmu_id] ASC,
[Datum_tol] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
The error complains about converting strings to decimals, so I guess the issue might be related to the Hungarian locale using decimal commas instead of decimal points. When I save the data of that table to a script instead of transferring it directly, the saved script loads the data without any problems.
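For what it's worth, the same class of failure can be shown server-side in T-SQL, where a decimal-comma string is never accepted by a decimal cast (the bulk insert error itself happens client-side in .NET, so this is only an illustration of the separator mismatch, not a reproduction of the tool's code path):

```sql
-- fogy_norma is decimal(5, 3); TRY_CAST returns NULL where a plain
-- CAST would raise the "cannot convert" error.
SELECT TRY_CAST('12,345' AS decimal(5, 3)) AS with_comma,  -- NULL: comma rejected
       TRY_CAST('12.345' AS decimal(5, 3)) AS with_point;  -- 12.345: point accepted
```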
By the way, the exported script already uses USE to make the target database the current one, so prefixing the CREATE TABLE, the TRUNCATE TABLE, and all the INSERT INTO statements with the database name is redundant, wastes space (sometimes massively), and is a nightmare if one has to change the target database: the prefix has to be changed or removed for each insert individually. The Data Transfer tool also has no option to save the inserts in batches, so each row becomes a separate INSERT (though in some cases this has its advantages).
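A trimmed form of the generated script, as suggested above, could rely on USE alone; the row values below are made up for illustration, this is not the tool's actual output:

```sql
USE [TX_ONE_DATA];
GO
TRUNCATE TABLE [dbo].[jarmu_kieg];
GO
-- Still one row per INSERT, as the tool currently emits, but without the
-- redundant database prefix; retargeting the script now means editing
-- only the USE line above.
INSERT INTO [dbo].[jarmu_kieg]
    ([jarmu_id], [ferohely], [ulohely], [statferohely], [statusz],
     [fogy_norma], [Datum_tol], [Datum_ig], [KK_id],
     [change_timestamp], [etl_timestamp])
VALUES
    (1, 50, 40, 45, 'A', 12.345,
     '2023-01-01', NULL, 1, '2023-12-20 02:45:31', NULL);
```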
Tue Dec 19, 2023 9:58 pm
SysOp
Site Admin
Joined: 26 Nov 2006 Posts: 7907
The error message indicates that the selected data insertion method is "bulk INSERT", which means data is sent in binary format. The error appears to occur while preparing decimal values from the fogy_norma column. It might be something specific to regional settings; I haven't yet found where to specify decimal points instead of decimal commas, though most tools support that option.
May I ask you to try the same transfer using a different data insert method? Since the scripted version worked, I assume that multi-row insert should work too. Please try changing the "bulk INSERT" method to "Multiple row INSERT", which is similar to the scripted version and almost as fast as bulk insert.
Wed Dec 20, 2023 1:31 am
gemisigo
Joined: 11 Mar 2010 Posts: 2141
Re: [12.4.476 Pro] - Data Transfer issue
The Data Transfer tool also does not have the option to save the inserts in batches, so each of the rows is a separate insert (though in some cases this has its advantages).
I apologize for the accusation. Take it as my excuse that my RSD (right-side display) is a worthless piece of sh!t and I didn't even see that there's a tabbed control right in front of my eyes. Here's a comparison of what I see on the two displays (sorry for the bad-quality photos):
LSD:
RSD:
On RSD, all the colors are washed out, the control frames are sort of invisible, etc. Nonetheless, I should have noticed that the words Project and Options were not at the same level. But I didn't.
I clicked the Options tab, and there they were, the options I thought were missing.
Now the transfer of the complete set of tables I selected runs successfully without any errors, but I wanted to save it as a script as well, so that I have to send the data sightseeing only once. I'll have to repeat the data load quite a few times, and it takes a while to push it through the wires. Alas, my tests show that those options only affect the direct data transfer; the saved script is the same regardless of their state, so part of that accusation still stands (the option is there, but unexpectedly it doesn't apply when saving the script).
I know I can export the data at the destination and save it there, but this could have done it in a single step.
Wed Dec 20, 2023 4:05 am
SysOp
Site Admin
Joined: 26 Nov 2006 Posts: 7907
Thank you for the detailed description; no need to apologize. I tried reproducing the issue using SQL session language settings and local computer regional settings, but so far without success. It might be a combination of installed driver versions and server settings, or a difference between them. Mine are all US_English, which isn't helping in this case. I will create a support case for this one and ask for help; we have test systems in other geographical regions that might be useful for troubleshooting.
Wed Dec 20, 2023 10:05 am
gemisigo
Joined: 11 Mar 2010 Posts: 2141
The generated script does not take into account that some tables might have identity columns; when they do, the script fails.
Thu Dec 21, 2023 5:45 am
SysOp
Site Admin
Joined: 26 Nov 2006 Posts: 7907
I believe that's by design. In most cases the existing values need to be transferred as-is in order not to break referential integrity and business references to existing primary keys. The script also doesn't know in which database it is going to be executed; it could be used in a very different setup. The Data Transfer tool supports cross-database (and cross database type) transfers, for example from SQL Server to SQLite, Postgres, or something else.
Manually adding SET IDENTITY_INSERT ON/OFF may be required in some scenarios when executing the script manually. The Data Transfer UI does that automatically when preparing the destination tables.
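For a table that does have an identity column, the manual wrapping would look roughly like this (a sketch only; [SomeTable] and its columns are placeholder names, since jarmu_kieg itself has no identity column):

```sql
-- Allow explicit values into the identity column for this table.
-- Note: only one table per session can have IDENTITY_INSERT ON at a time.
SET IDENTITY_INSERT [dbo].[SomeTable] ON;

-- An explicit column list is mandatory when supplying the identity value.
INSERT INTO [dbo].[SomeTable] ([id], [name])
VALUES (42, 'example row');

SET IDENTITY_INSERT [dbo].[SomeTable] OFF;
```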
Thu Dec 21, 2023 9:32 am
SysOp
Site Admin
Joined: 26 Nov 2006 Posts: 7907
Forgot to mention that, unlike scripted inserts, the UI can optionally disable triggers and constraints when inserting the data, and re-enable them after the transfer.
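Expressed as T-SQL that one could add around the inserts in a saved script (a sketch of the same idea, not the UI's exact statements):

```sql
-- Before the load: disable triggers and foreign key / check constraints.
ALTER TABLE [dbo].[jarmu_kieg] DISABLE TRIGGER ALL;
ALTER TABLE [dbo].[jarmu_kieg] NOCHECK CONSTRAINT ALL;

-- ... the bulk of the INSERT statements goes here ...

-- After the load: re-enable triggers, then revalidate the constraints.
-- WITH CHECK rechecks the inserted rows so the constraints stay trusted.
ALTER TABLE [dbo].[jarmu_kieg] ENABLE TRIGGER ALL;
ALTER TABLE [dbo].[jarmu_kieg] WITH CHECK CHECK CONSTRAINT ALL;
```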
Thu Dec 21, 2023 9:35 am
gemisigo
Joined: 11 Mar 2010 Posts: 2141
I believe that's by design. In most cases the existing values need to be transferred as is in order not to break referential integrity and business references to existing primary keys.
I don't understand. How does NOT adding SET IDENTITY_INSERT ON/OFF help avoid breaking referential integrity? Other than the fact that the data won't get into the database at all, I mean. Looked at that way, it indeed has no chance of breaking anything.
It also doesn't know in which database the generated script is going to be executed, could be used in a very different setup.
Yes, that's true, but the data transfer does not work without picking both a source and a destination (neither does saving the script), and it's more likely than not that the destination picked for this process will be the one the script runs against, even if the data is only going to be saved to a script for the time being. For now, the saved script does not work on the chosen target. I understand that the UI version can check for the identity columns and various constraints currently present and can enable/disable them on the fly, and that creating SQL logic capable of this might reach a complexity that makes it not worth it (or plain impossible), so it could only plan for an existing environment. But I think saving a static script that actually works on the intended target still makes more sense than having to manually edit the script to resolve issues that could have been addressed at the time it was saved. Manually editing a potentially multi-gigabyte/terabyte script might not always be possible (as in my case). It can also be very time-consuming.
Manually adding SET IDENTITY_INSERT ON/OFF might be required in some scenarios when executing the script manually.
I cannot. There's no editor on the target machine capable of opening that file without freezing, and the file is already at its destination. I would have to copy it back, find an appropriate editor, change the file, hope I didn't miss anything, and then copy it back again, only to test that theory of "getting it right on the first try". Rinse and repeat for each mistake. I chose to run the Data Transfer with settings that didn't fail on those naughty decimals, as it was less "painful". I hate regional settings with a passion.
Forgot to mention that unlike scripted inserts, the UI can optionally disable triggers, and also constraints when inserting the data, and reenable them after data transfer.
Thanks, those can speed things up considerably. Now, if those statements disabling/enabling triggers/constraints before/after the inserts could somehow make their way into the scripts...
Okay, I'm sorry, I know I'm insatiable. But my mind always kicks in when I see unused potential. And here I see plenty.
Thu Dec 21, 2023 2:11 pm