Wednesday, May 01, 2013

Windows Azure - SQL Server Environment - Connection String

I've been going through the Hands on Labs for Windows Azure.  I made it to the Windows Azure Web Sites and Virtual Machines using ASP.NET and SQL Server - for Visual Studio 2012 lab before I encountered an issue. 

The issue was around the SQL Server connection string that is changed in the web.config of the Visual Studio code.  Here is what the lab shows:

<add name="ContactManagerDb" connectionString="Data Source={yourServerAdress},{yourPort};Initial Catalog=ContactManagerDb;User Id={username};Password={password};" providerName="System.Data.SqlClient" />  

My confusion came from the fact that during the lab we worked with 2 ports and 2 usernames.  At first I could not get any combination to work.  I finally figured it out and need to write it down, as I will forget this tomorrow :). 

The server address is the DNS name that can be found on the dashboard of the virtual machine.  The port is the Public Port of the TCP endpoint.  The User Id and Password are the SQL username and password, not the virtual machine username and password.  In this lab that would be "TestUser".
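Putting that together for this lab, the filled-in connection string looks something like this (the DNS name and public port below are hypothetical examples - use the values from your own VM's dashboard and endpoint list; the password placeholder is left as-is):

```xml
<!-- Example only: mytestvm.cloudapp.net and 57501 are made-up placeholder values -->
<add name="ContactManagerDb"
     connectionString="Data Source=mytestvm.cloudapp.net,57501;Initial Catalog=ContactManagerDb;User Id=TestUser;Password={password};"
     providerName="System.Data.SqlClient" />
```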

I also opened both port 1433 and port 57501 in the Windows Firewall rules.  This does not seem to have been the issue, as it appears opening just 1433 works.

Tuesday, February 12, 2013

HL7 Message - "Missing Body Schema"

I've been working with HL7 messages for a few months now.  As anyone who has worked with HL7 knows, each message type can have many different versions - 2.3.1, 2.4, 2.5.1, and many more.  The question arises as to how to create one process for each message type that can handle all versions.  The correct answer is to create a canonical schema, create a map for each of the versions, and put the maps on the receive or send port.

Now let's say you come on to a project that has already been implemented and they are not using canonical schemas.  The next step (after going on and on about why you really should use canonical schemas) is to use the Microsoft.XLANGs.BaseTypes.Any schema.

Just a little bit of info for those who are new to HL7.  An HL7 message contains 3 parts: the MSHSegment, or header segment; the BodySegment, which obviously contains the body; and a ZSegment, which can contain additional information that is not defined in the normal HL7 message but in my experience is usually left blank.  I set the MSH segment to an MSH schema, the BodySegment to the Any schema, and the ZSegment to either an XmlDocument or String .NET value.
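For context, here is what a raw HL7 v2 message looks like on the wire before the accelerator splits it into those parts (a hypothetical ADT^A01 sketch - all field values are made up for illustration):

```text
MSH|^~\&|SendingApp|SendingFac|ReceivingApp|ReceivingFac|200902110830||ADT^A01|MSG00001|P|2.3.1
EVN|A01|200902110830
PID|1||12345^^^MRN||Doe^John||19700101|M
ZZZ|optional site-specific data
```

The first line maps to the MSHSegment part, the middle segments to the BodySegment, and the trailing Z segment (when present) to the ZSegment part.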

Using this type of schema will work; however, you will run into issues when trying to assemble the HL7 message in the BTAHL72XPipelines.BTAHL72XSendPipeline.  If you get the error message "Body schema is missing", it does not mean you are missing the body segment.  It means that the HL7 assembler cannot determine which HL7 schema to use based on the MessageType namespace - which would be set to "".

The following code can be used if calling the BTAHL72XPipelines from an orchestration.  To call this code from a pipeline component you would only need to call the setContext method.  For reference, XLANGMessage is a message that is used within the orchestrations and does not allow you to set the MessageType context property.  IBaseMessage is used within the pipeline/pipeline components and allows you to set the MessageType context property.  There is no direct way to create one type from another.

public static string HL7SetContext(XLANGMessage hl7Message, string messageType)
{
    if (hl7Message == null || hl7Message.Count == 0)
        return string.Empty;

    //Create an IBaseMessage
    IBaseMessageFactory factory = new MessageFactory();
    IBaseMessage inputMessage = factory.CreateMessage();
    inputMessage.Context = factory.CreateMessageContext();

    string partName = string.Empty;
    Stream stream = null;

    //Get the XLANG parts and add them to the IBaseMessage
    for (int whichPart = 0; whichPart < hl7Message.Count; whichPart++)
    {
        partName = hl7Message[whichPart].Name;
        stream = hl7Message[whichPart].RetrieveAs(typeof(Stream)) as Stream;

        IBaseMessagePart part = factory.CreateMessagePart();
        part.Data = stream;

        //The BodySegments part must be flagged as the message body part
        inputMessage.AddPart(partName, part, string.Compare(partName, "BodySegments", true) == 0);
    }

    //Very important step, do not leave this out
    inputMessage = setContext(inputMessage, messageType);

    //Create the pipeline
    SendPipelineWrapper sendPipeline =
        Winterdom.BizTalk.PipelineTesting.PipelineFactory.CreateSendPipeline(
            typeof(BTAHL72XPipelines.BTAHL72XSendPipeline));

    IBaseMessage ret = null;
    try
    {
        ret = sendPipeline.Execute(inputMessage);
    }
    catch (Exception)
    {
        throw;
    }

    //TODO:  Add your code using the message in ret

    return string.Empty;
}

//The values below are shown as empty strings as examples, but when coding they must contain a value
IBaseMessage setContext(IBaseMessage inputMessage, string messageType)
{
    inputMessage.Context.Promote("MessageType", "", messageType);

    //Using a custom HL7 party
    inputMessage.Context.Promote("MSH3_1", "http://HL7Schemas.HeaderPropertySchema", "");
    inputMessage.Context.Promote("MSH5_1", "http://HL7Schemas.HeaderPropertySchema", "");
    inputMessage.Context.Promote("IsStaticAck", "http://HL7Schemas.HeaderPropertySchema", false);
    inputMessage.Context.Promote("ZPartPresent", "http://HL7Schemas.HeaderPropertySchema", false);
    inputMessage.Context.Promote("SPName", "", "XXX");
    inputMessage.Context.Promote("MessageEncoding", "http://HL7Schemas.HeaderPropertySchema", 65001);
    inputMessage.Context.Promote("MSH1", "http://HL7Schemas.HeaderPropertySchema", "124");
    inputMessage.Context.Promote("MSH2", "http://HL7Schemas.HeaderPropertySchema", "^~\\&");
    inputMessage.Context.Promote("SegmentDelimiter2Char", "http://HL7Schemas.HeaderPropertySchema", true);

    return inputMessage;
}

Monday, May 09, 2011

Migrating SQL Server 2008 R2 database to SQL Azure

I was tasked with taking a pre-existing database and migrating it to SQL Azure.  The reason I needed a SQL Azure database was because I am working on a project that has 2 developers, one located in Chicago and one (me) in Indy, and we needed to make sure we were using the same database at all times during development.  And no, we do not have a network.

Thanks to the fact that all SQL Azure tables must have clustered indexes, it took me quite a bit longer to migrate the data of the database than I had anticipated.  To help others, and also to remind myself, I am posting the steps I took below.

First, here is the error I got when I tried to use the Data Export method in SQL Server 2008 R2:
Error 0xc020844b: Data Flow Task 1: An exception has occurred during data insertion, the message returned from the provider is: Tables without a clustered index are not supported in this version of SQL Server. Please create a clustered index and try again.  (SQL Server Import and Export Wizard)
Since the database was already created I did not want to go through all 60 some tables and change the indexes.  The following steps are what I had to do to get the data into the SQL Azure database.
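If you only have a handful of offending tables, the direct fix is to add a clustered index (often a clustered primary key) to each one - something along these lines, with a made-up table and column name:

```sql
-- Hypothetical table; a clustered primary key satisfies SQL Azure's requirement
ALTER TABLE dbo.Contacts
ADD CONSTRAINT PK_Contacts PRIMARY KEY CLUSTERED (ContactId);
```

With 60-some tables, though, the disable/re-enable approach below was faster.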

1) Create a SQL Azure Server/Database.
a. I'm not going to go into details on this, for more info see:
2) Generate Database Script.
a. Open SQL Server 2008 R2 and open your on-premise database.  Right click on your 2008 database and select Tasks->Generate Scripts
b. On the Introduction Screen, select Next.
c. On the Choose Objects screen select “Select specified objects” and then select the objects you want to migrate to SQL Azure.
d. On the Set Scripting Options screen, press the Advanced Button.
e. Under the General Options, select “Script for the database engine type” and select “SQL Azure Database”
f. Under the Table/View Options, select “Script Indexes” and change the value to “True”
g. Select OK.
h. On the Set Scripting Options screen, change the output to be “Save to New Query window”.
i. Press Next.
j. On the summary screen press Next.
k. Make sure all objects were scripted successfully. Press Finish.
3) Create empty tables in SQL Azure.
a. In SQL Server 2008 R2 Management Studio, select the query that was created in step 2.
b. Right click in the query window and select Connection->Change connection.
c. The SQL Server connection window will appear. Connect to your SQL Azure server. Your login is the <username>@<SQL Azure Server Fully Qualified DNS Name>
d. Execute the query
4) Turn off Foreign Keys and Indexes.
a. In order to avoid the error mentioned above and to allow data to be entered in any order, the foreign keys and indexes must be disabled.
b. There are stored procedures that can be run to achieve this.
c. In SQL Server Management Studio, connect to your SQL Azure database.
d. Open a new Query and paste in the following code to create the stored procedures:
IF OBJECT_ID('[dbo].[SetForeignKeyEnabledStatus]','P') IS NOT NULL
DROP PROCEDURE [dbo].[SetForeignKeyEnabledStatus];
GO
CREATE PROCEDURE [dbo].[SetForeignKeyEnabledStatus]
@enabled bit
AS
BEGIN
DECLARE @schema_name sysname;
DECLARE @table_name sysname;
DECLARE @fk_constraint_name sysname;
DECLARE @cmd_txt varchar(8000);

DECLARE fk_cursor CURSOR FOR
SELECT sc.name AS schema_name, so.name AS table_name, so2.name AS fk_constraint_name
FROM sys.objects so
inner join sys.schemas sc ON so.schema_id = sc.schema_id
inner join sys.foreign_key_columns fk ON so.object_id = fk.parent_object_id
inner join sys.objects so2 ON so2.object_id = fk.constraint_object_id
WHERE so.type = 'U';

OPEN fk_cursor;
FETCH NEXT FROM fk_cursor INTO @schema_name, @table_name, @fk_constraint_name;

WHILE @@FETCH_STATUS = 0
BEGIN
    IF @enabled = 0
        SET @cmd_txt = 'ALTER TABLE [' + @schema_name + '].[' + @table_name + '] NOCHECK CONSTRAINT [' + @fk_constraint_name + '];';
    ELSE
        SET @cmd_txt = 'ALTER TABLE [' + @schema_name + '].[' + @table_name + '] WITH CHECK CHECK CONSTRAINT [' + @fk_constraint_name + '];';

    PRINT @cmd_txt;
    EXECUTE (@cmd_txt);

    FETCH NEXT FROM fk_cursor INTO @schema_name, @table_name, @fk_constraint_name;
END

CLOSE fk_cursor;
DEALLOCATE fk_cursor;
END
GO
IF OBJECT_ID('[dbo].[SetIndexEnabledStatus]','P') IS NOT NULL
DROP PROCEDURE [dbo].[SetIndexEnabledStatus];
GO
CREATE PROCEDURE [dbo].[SetIndexEnabledStatus]
@enabled bit
AS
BEGIN
DECLARE @schema_name sysname;
DECLARE @table_name sysname;
DECLARE @index_name sysname;
DECLARE @cmd_txt varchar(8000);

DECLARE idx_cursor CURSOR FOR
SELECT sc.name AS schema_name, so.name AS table_name, si.name AS index_name
FROM sys.objects so
inner join sys.schemas sc ON so.schema_id = sc.schema_id
inner join sys.indexes si ON so.object_id = si.object_id
WHERE so.type = 'U'
AND si.index_id > 1;

OPEN idx_cursor;
FETCH NEXT FROM idx_cursor INTO @schema_name, @table_name, @index_name;

WHILE @@FETCH_STATUS = 0
BEGIN
    IF @enabled = 0
        SET @cmd_txt = 'ALTER INDEX [' + @index_name + '] ON [' + @schema_name + '].[' + @table_name + '] DISABLE;';
    ELSE
        SET @cmd_txt = 'ALTER INDEX [' + @index_name + '] ON [' + @schema_name + '].[' + @table_name + '] REBUILD;';

    PRINT @cmd_txt;
    EXECUTE (@cmd_txt);

    FETCH NEXT FROM idx_cursor INTO @schema_name, @table_name, @index_name;
END

CLOSE idx_cursor;
DEALLOCATE idx_cursor;
END
GO
e. Execute the query.
f. Open the Stored Procedures node and verify that the 2 stored procedures exist.
g. In a new Query window on the SQL Azure database, run the following statements:
          i. EXEC SetForeignKeyEnabledStatus 0
          ii. EXEC SetIndexEnabledStatus 0
5) Export Data to SQL Azure
a. Open your on-premise SQL Server 2008 R2 database. Right click and select Tasks->Export Data
b. On the Choose a Data Source screen, select the on-premise database and then select Next.
c. On the choose destination, select “.Net Framework Data Provider for SqlServer” from the Destination drop down box.
d. Under the Security section, change Encrypt to “True”
e. Enter your password
f. Change TrustServerCertificate to “True”
g. Enter your username as <username>@<sql azure server fully qualified DNS name>
h. Under Source, set the Data Source to the <SQL Azure Server Fully Qualified DNS Name>
i. Enter the database in the Initial Catalog. The available databases should be populated in the drop down box.
j. Select Next
k. On the Specify Table Copy or Query screen, select “Copy data from one or more tables or views” and press Next.
l. On the Select Source tables and views screen, select the tables and views to copy. Press Next.
m. On the Save and Run Package screen, select “Run immediately”. Press Finish.
n. Once the process has run, make sure no error occurred.
            i. I did get the following error the first time I ran the process:
Warning 0x80049304: Data Flow Task 1: Warning: Could not open global shared memory to communicate with performance DLL; data flow performance counters are not available.  To resolve, run this package as an administrator, or on the system's console.
                      To fix, I closed all connections to my SQL Azure database, except the export wizard.

6) Turn indexes and foreign keys back on.
a. Run the following stored procedures to rebuild the indexes and re-enable the foreign keys:
          i. EXEC SetIndexEnabledStatus 1
          ii. EXEC SetForeignKeyEnabledStatus 1
7) Add User/Login
a. While not part of adding data, the last step was needed to make sure both developers had access.
b. Open a new Query window in the SQL Server Management Studio, pointing to the SQL Azure master database.
c. Copy the following code, changing the username and password values.
                    CREATE LOGIN <username>
                          WITH PASSWORD = '<password>'
d. Open a new Query window in SQL Server Management Studio, pointing to the SQL Azure database, and run the following code:
                        CREATE USER <username>
                         FOR LOGIN <username>
                       WITH DEFAULT_SCHEMA = dbo
                          -- Add user to the database owner role
                    EXEC sp_addrolemember N'db_owner', N'<username>'

Thursday, January 20, 2011

MPS Partners BizTalk presentations/whitepapers

My company has taken quite a look at BizTalk and the cloud (Azure).  My manager, Chris Kabat, and I put together a demo for Tech Ed 2010 that shows how BizTalk can be used to connect SAP with a web application using Azure:

Chris also did a presentation for Channel 9 on using Windows Azure:

Lastly, Chris and another co-worker, Naresh Koka, created a whitepaper on SAP and BizTalk that can be found here: 

It's been a busy year for us - though mainly for Chris :).

Saturday, June 05, 2010

SSO Error - Password Change

I was reminded today - again - of why you should never change the password of the account that runs the BizTalk servers.  My Tech Ed demo runs in a VPC, and the other day the password of the account on the VPC (yes, I use only one for VPCs) expired, so I changed it.  Well, I was doing a run-through today and my receive locations would not start (panic!).  They were giving me this error:

Cannot perform encryption or decryption because the secret is not available from the master secret server.

The master secret is on the computer (I verified) and the SSO Service is started - so what was the problem?  First let me state, I did go back and change my password back to the original password, but that did not solve the issue.  I opened the SSO Administration console and noticed that under Servers there were no servers listed.  I added my computer, then in the SSO Admin console I selected the server and did a stop and then a start.  I was then able to Enable my receive locations.

Oh, this was on a BizTalk 2010 beta installation - but in general it is a bad idea to change your password when running BizTalk.