Cannot make it work

Aug 29, 2011 at 10:07 AM
Edited Aug 29, 2011 at 10:15 AM

Hi,

I'm trying to do simple exports of my databases to my blob storage.

1) For the SQLDacCli 1.2:

Here is the output of my command:

Server has been mapped to North Europe datacenter.
Arguments validation completed.
Export started: 29/08/2011 11:59:13
Operation Response: 0ce4c7db-fd36-45ec-b8ff-428b0bd8d09d
Export Submit Complete.  Total time: 00:00:15.8435353


Seems right, but I don't see any bacpac file in my blob storage.

When I try to get the status of my exports, the status retrieval simply blocks.

2) For the SQLDacCli 1.1:

The export always blocks at:

Connection Open.
Export started: 29/08/2011 12:12:53
[12:12:57] ExtractDac: Pending Extracting Database

Any clue?

Thanks.

Aug 29, 2011 at 2:32 PM

Got it!

I had used -U username instead of -U username@servername.

Then I saw that I was getting "Invalid length for a Base-64 char array." errors.

After that, I put quotes around my storage key and my blob URL, and it seems to work now.
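For anyone who hits the same errors, the working command ends up looking roughly like this (placeholder server, credentials, key, and URL; flag names as used elsewhere in this thread):

DacIESvcCli -S myserver.database.windows.net -U myuser@myserver -P mypassword -D MyDatabase -X -BLOBACCESSKEY "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx==" -BLOBURL "http://myaccount.blob.core.windows.net/mycontainer/MyDatabase.bacpac"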

Thanks.

Aug 29, 2011 at 4:25 PM

Yes, sometimes you have to quote your keys; it depends on the characters that are in them.  As a best practice it never hurts to quote them: the quotes are stripped off and ignored, they are just there so the command-line parser picks up your entire key.

You should not have to use username@servername either, unless Sql Azure has made that a requirement now.  I do it all the time with just the user name.


Coordinator
Aug 29, 2011 at 5:13 PM

Thanks rodbeck, I added a note to the release page.

Aug 31, 2011 at 10:09 PM

I am still having an issue getting a basic export to work:

Command:
DacIESvcCli -S po8xxxxxxxxx.database.windows.net -U bcxxxxxxxxxx -P xxxxxxxxx -D SRC -X -BLOBACCESSKEY iXXXXXXXXXXXXXXXXXXXXLUXXXXXXXXXXVk5DoccptqYUhmP7IgNPONLQgHS8xdP8FlM00bvOSWht9Fhg== -BLOBURL http://MYURL.blob.core.windows.net/databasepoc/mySRC.bacpac -ACCESSKEYTYPE SHARED

An error has occurred:
The type initializer for 'DacIEServiceCli.DataCenterMapper' threw an exception.
An error occurred creating the configuration section handler for DataCenterMapping: Request failed. (C:\Users\v-shmeht\Desktop\SQLAzureImportExportUtilities\BACPAC v1.2\DacIESvcCli.exe.Config line 25)

Looking at the config file, line 25 has the North Central data center location:
      <add dns="CH1-1" name="North Central US" endpoint="https://ch1prod-dacsvc.azure.com/DACWebService.svc"/>

BTW: I can't get the .SVC URL to work for either of the data center service locations.
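For reference, that entry sits inside the DataCenterMapping section of the config, which (going only by the entry quoted above and the section name in the error message) looks something like:

<DataCenterMapping>
  <add dns="CH1-1" name="North Central US" endpoint="https://ch1prod-dacsvc.azure.com/DACWebService.svc"/>
  <!-- entries for the other datacenters follow the same pattern -->
</DataCenterMapping>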

Aug 31, 2011 at 10:22 PM

An error has occurred:
The type initializer for 'DacIEServiceCli.DataCenterMapper' threw an exception.
An error occurred creating the configuration section handler for DataCenterMapping: Request failed. (C:\Users\v-shmeht\Desktop\SQLAzureImportExportUtilities\BACPAC v1.2\DacIESvcCli.exe.Config line 25)

Sorry you are still having problems.

Line 25 is actually the DataCenterMapping tag itself, the start of that configuration section, which makes me think something else is wrong.

Are you sure you updated the EXE and the config?  If you grabbed it out of a ZIP, make sure you right-click it, select Properties, and click Unblock.  Otherwise it may not have permission to load the type.
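If you would rather do the unblock from the command line, one option (assuming you have the Sysinternals streams utility; this is not part of the Import/Export tool itself) is to delete the Zone.Identifier stream that marks the files as downloaded:

streams.exe -d DacIESvcCli.exe
streams.exe -d DacIESvcCli.exe.Config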

What Operating System is this?

Just to verify, you have the full .Net 4 framework installed on this machine (not the Client Profile)?   
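One quick way to check (assuming the standard .Net 4 registry layout) is to query for the Full subkey, which only exists when the full framework is installed:

reg query "HKLM\SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full" /v Version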

Are you running this from a normal command prompt, or an administrator one?  I don't think you would need admin level, but you could try it just in case.  Launch the command prompt as administrator (you should see Administrator: in the title bar).

Where is your database located?  Is it in a production datacenter (not a private cluster)?


Aug 31, 2011 at 10:36 PM

Hello Jason,

That moved me past the initial issue.

There is a new issue that I need to troubleshoot on the Azure blob storage setup side.

When I check the status with the -status command:
Blob http://azurexxxxxxxxxxtest.blob.core.windows.net/databasepoc/mySRC.bacpacixxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx== is not writeable.

I have answered your questions INLINE below.

Thanks,

Shital Mehta

From: JasonShort [email removed]
Sent: Wednesday, August 31, 2011 3:23 PM
To: ShitalKumar Mehta (Synergy Technologies LLC)
Subject: Re: Cannot make it work [sqldacexamples:270751]


Are you sure you updated the EXE and the config? If you grabbed it out of a ZIP, make sure you right-click it, select Properties, and click Unblock. Otherwise it may not have permission to load the type.
[SHITAL MEHTA] What you suggested got me past the original issue.

Server has been mapped to North Central US datacenter.
Arguments validation completed.
Export started: 8/31/2011 3:25:21 PM
Operation Response: df877953-fda6-4b18-b66c-8cb1dbcb32d2
Export Submit Complete. Total time: 00:00:00.9969733

What Operating System is this?
[SHITAL MEHTA] I am on the Microsoft corporate network, using Windows 7 64-bit.

Just to verify, you have the full .Net 4 framework installed on this machine (not the Client Profile)?
[SHITAL MEHTA] YES

Are you running this from a normal command prompt, or an administrator one? I don't think you would need admin level, but you could try it just in case. Launch the command prompt as administrator (you should see Administrator: in the title bar).
[SHITAL MEHTA] I used admin command prompt.

Where is your database located? Is it in a production datacenter (not a private cluster)?
[SHITAL MEHTA] It's in the North Central production datacenter.

Aug 31, 2011 at 10:44 PM

When I check the status with the -status command:
Blob http://azurexxxxxxxxxxtest.blob.core.windows.net/databasepoc/mySRC.bacpacixxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx== is not writeable.


You can use the Azure Storage Explorer to view the permissions.  I notice the original key in your command line was flagged as shared.  Make sure it is read/write; we have to be able to write to that location in order to put the bacpac there.

You can also use the storage account key (primary key) and set -accesskeytype storage when you are using an account-level key.

We don't have a way to validate whether the key you gave us is shared or account level; we just try to access storage using the method you set.

Keys can be confusing.  But if you are copying the key from the Azure Portal, you need to set the key type to storage.  There is currently no way to generate a shared access key from the portal, so I think very few people will be using those.
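In other words, something along these lines, with the primary key copied from the portal (placeholder values; flag names as used earlier in this thread):

DacIESvcCli -S myserver.database.windows.net -U myuser@myserver -P mypassword -D MyDatabase -X -BLOBACCESSKEY "primary-account-key==" -BLOBURL "http://myaccount.blob.core.windows.net/mycontainer/MyDatabase.bacpac" -ACCESSKEYTYPE storage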

Sep 1, 2011 at 6:00 PM

Thanks Jason.

Your suggestions helped me get the export and import working.

Helpful things to know were:

1. To unblock the tool: right-click the EXE and the config file --> Properties --> click "Unblock"
        ("This file came from another computer and might be blocked to help protect this computer")
2. To use ACCESSKEYTYPE 'Storage' for an account-level blob storage key.

My export has been very slow, though. The same database that we snapshotted with a database copy [COPY AS] in 40 minutes to 1 hour has been exporting from 5 PM on 8/31 until 11 AM on 9/1, which is 18 hours.

Here is what I did to check the status.

1. Issued a -STATUS command against the same server the export was started from:
   DacIESvcCli -s sourceserverfromwhereexportwasstarted -u USERNAME -p PASSWORD -status
2. The status of the export was showing "Running".
3. I went back and checked the container inside the BLOBURL using "Azure Storage Explorer". The exported file "MYDATABASE.BACPAC" is showing as a 0 KB file.
   Combining 2 and 3, I believe the export operation is still going on.

Is there a way for me to know that it hasn't gotten lost in the clouds? Is there a way I can check, through some queries, how far along it is?

Thanks,
Shital Mehta

Sep 1, 2011 at 6:04 PM

My export has been very slow, though. The same database that we snapshotted with a database copy [COPY AS] in 40 minutes to 1 hour has been exporting from 5 PM on 8/31 until 11 AM on 9/1, which is 18 hours.

So you have already imported it to Sql Azure and are now exporting it again?

Copying on premise against a local Sql Server is much faster because it just writes all the pages of the database to local disk; it does not export the data.  Remember that you can't take an on-premise database file directly to Sql Azure, so we have to extract all the data and schema into a logical format.
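For comparison, the copy you timed is (if I am reading "[COPY AS]" right) the server-side T-SQL database copy, which duplicates pages inside the datacenter instead of extracting a logical package (placeholder database names):

CREATE DATABASE MyDatabase_Copy AS COPY OF MyDatabase;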

But 18 hours seems extreme to me.  I pinged you about the issue.


Sep 2, 2011 at 6:40 PM

For anyone tracking this thread: I am working with Jason and the other project members and will update the thread as soon as something is figured out regarding the slow export issue.