
We have a BizTalk 2010 receive location that receives a 70MB file; an inbound map (on the receive location) and an outbound map (on the send port) then produce a 1GB file.

While this process runs, it consumes a lot of disk I/O on SQL Server, and the performance of other receive locations is badly affected.

We have tried reducing the maximum disk I/O threads on the host instance of that receive location, but it still consumes a lot of disk I/O on SQL Server.

This process is actually very low priority. Is there any way to reduce its disk I/O usage on SQL Server so that other processes can perform normally?

  • Have you tried creating a separate host for this particular Receive Port?
    – kletnoe
    Commented Sep 26, 2014 at 6:10
  • Yes, I have tried creating a separate host for this particular Receive Location and also set the maximum disk I/O threads lower for that host. However, it still uses a lot of SQL Server disk I/O while getting the file into the MessageBox, and the file-receiving performance of all other receive locations is affected.
    – hosir
    Commented Sep 26, 2014 at 13:34

2 Answers

1

This issue isn't related to the speed of the file input but, as you mentioned in a comment, to the load placed on the MessageBox when persisting the 1GB map output. You have a few options to minimize the impact this has on other processes:

  1. Adjust the throttling settings on the newly created host to something very low. This may or may not work the way you want it to, though.
  2. Set a service window on the receive location for these files so that they only run during off hours. This is ideal if you don't have 24/7 demand on the MessageBox and can afford slow response times in the middle of the night (say 2-3 AM).
  3. If your requirements allow it, don't map the file in the receive port; instead, route it to an orchestration and/or custom pipeline component that splits it into smaller pieces and then maps the smaller pieces (see the sketch after this list). This gives you more fine-grained control over the speed at which the pieces are processed (put a Delay shape in the loop that processes them). There may still be issues when you join them back together, but it shouldn't be as bad as your current process.
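For option 3, here is a minimal sketch of the kind of splitter you could call from a custom pipeline component or from a helper class invoked by the orchestration. It assumes the 70MB input is XML with a repeating <Record> element; the element name, file paths, and batch size are placeholders, not part of your solution.

```csharp
using System;
using System.IO;
using System.Xml;

// Streams a large XML input and writes every BatchSize <Record> elements to a
// separate, smaller file. Only one record is held in memory at a time.
class LargeFileSplitter
{
    const int BatchSize = 1000; // records per output file - tune for your map

    static void Main()
    {
        string inputPath = @"C:\Data\In\bigfile.xml";  // placeholder paths
        string outputDir = @"C:\Data\Split";
        Directory.CreateDirectory(outputDir);

        int fileIndex = 0;
        int recordCount = 0;
        XmlWriter writer = null;

        using (XmlReader reader = XmlReader.Create(inputPath))
        {
            while (reader.Read())
            {
                if (reader.NodeType == XmlNodeType.Element && reader.Name == "Record")
                {
                    if (writer == null)
                    {
                        string outPath = Path.Combine(outputDir,
                            string.Format("batch_{0}.xml", fileIndex++));
                        writer = XmlWriter.Create(outPath, new XmlWriterSettings { Indent = true });
                        writer.WriteStartElement("Records");
                    }

                    // Copy just the current <Record> subtree into the output file.
                    using (XmlReader subtree = reader.ReadSubtree())
                    {
                        writer.WriteNode(subtree, false);
                    }
                    recordCount++;

                    if (recordCount == BatchSize)
                    {
                        writer.WriteEndElement();
                        writer.Close();
                        writer = null;
                        recordCount = 0;
                    }
                }
            }
        }

        // Flush the final, partially filled batch.
        if (writer != null)
        {
            writer.WriteEndElement();
            writer.Close();
        }
    }
}
```

Because XmlReader and XmlWriter both stream, the splitter never loads the whole document, so the split step itself adds very little memory or MessageBox pressure while it runs.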

It may also be worth looking at your map: if it makes lots of slow or processor-heavy calls, you might be able to refactor them.

0

Ideally you should debatch the file: apply the business logic, including the map, to each individual segment and load the segments into SQL one at a time. Later you can use a pipeline or some other .NET component to pull the data from SQL and rebatch it (a sketch follows below). Handling big XML (roughly 10 times the size of the equivalent flat file) in the BizTalk MessageBox is not good practice. If this were a pure messaging scenario, you could instead convert the file into a stream and route it to the destination.
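As a rough illustration of the rebatching step, the sketch below streams rows back out of SQL Server and writes them into one output document with XmlWriter, so the full 1GB result is never held in memory. The connection string, table, and column names are assumptions you would replace with your own.

```csharp
using System;
using System.Data.SqlClient;
using System.Xml;

// Streams processed segments out of SQL Server and rebatches them into a
// single XML file without building the whole document in memory.
class Rebatcher
{
    static void Main()
    {
        string connectionString = "Server=.;Database=Staging;Integrated Security=true"; // placeholder
        string outputPath = @"C:\Data\Out\rebatched.xml";                               // placeholder

        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand(
            "SELECT Id, Payload FROM dbo.ProcessedSegments ORDER BY Id", connection))
        {
            connection.Open();

            using (SqlDataReader reader = command.ExecuteReader())
            using (XmlWriter writer = XmlWriter.Create(outputPath, new XmlWriterSettings { Indent = true }))
            {
                writer.WriteStartElement("Records");

                while (reader.Read())
                {
                    writer.WriteStartElement("Record");
                    writer.WriteAttributeString("Id", reader.GetInt32(0).ToString());
                    // Payload is assumed to be an XML fragment stored as text.
                    writer.WriteRaw(reader.GetString(1));
                    writer.WriteEndElement();
                }

                writer.WriteEndElement();
            }
        }
    }
}
```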
