Error with SalesForce Import

Anosh Wadia
User - Author
Post count: 19
#1

I'm running an import from Salesforce to SalesLogix and I get the following error. The odd thing is that it's not consistent: sometimes it occurs around row 499 and sometimes around row 500.

I believe there is a setting that needs to be adjusted to indicate how many rows should be read at a time. Does anyone recall where that setting is?

Thanks!
Anosh Wadia

Unhandled Error (1):

An exception occurred during the operation, making the result invalid. Check InnerException for exception details.

System.ServiceModel.FaultException: System.Web.Services.Protocols.SoapException: Server was unable to process request. ---> System.IndexOutOfRangeException: Index was outside the bounds of the array.
at StarfishEngine.StarfishService.GrabOriginRowData(rowdat& RowData)
at StarfishEngine.StarfishService.ExecJob(String JobID, Boolean Commit, Boolean ChainJobs, Int32 LoggingLevel, String Argument, String BeginAtRow, String EndAtRow, String RowThreadCount)
--- End of inner exception stack trace ---

Server stack trace:
at System.ServiceModel.Channels.ServiceChannel.HandleReply(ProxyOperationRuntime operation, ProxyRpc& rpc)
at System.ServiceModel.Channels.ServiceChannel.EndCall(String action, Object[] outs, IAsyncResult result)
at System.ServiceModel.Channels.ServiceChannelProxy.InvokeEndService(IMethodCallMessage methodCall, ProxyOperationRuntime operation)
at System.ServiceModel.Channels.ServiceChannelProxy.Invoke(IMessage message)

Exception rethrown at [0]:
at System.Runtime.Remoting.Proxies.RealProxy.HandleReturnMessage(IMessage reqMsg, IMessage retMsg)
at System.Runtime.Remoting.Proxies.RealProxy.PrivateInvoke(MessageData& msgData, Int32 type)
at StarfishAdmin.StarfishService.StarfishServiceSoap.EndExecJob(IAsyncResult result)
at StarfishAdmin.StarfishService.StarfishServiceSoapClient.StarfishService_StarfishServiceSoap_EndExecJob(IAsyncResult result)
at StarfishAdmin.StarfishService.StarfishServiceSoapClient.OnEndExecJob(IAsyncResult result)
at System.ServiceModel.ClientBase`1.OnAsyncCallCompleted(IAsyncResult result)


Aron Hoekstra
Administrator
Post count: 2
#2

Anosh, can you tell me what version of Starfish you're running? I think this was a bug that may have been fixed in the final release of 2.1.

Anosh Wadia
User - Author
Post count: 19
#3

I am on v2.1.4210

Aron Hoekstra
Administrator
Post count: 2
#4

OK, the final version of 2.1 was 2.1.4226.

I think you are on a beta version now. Please download the latest version and try again.

Thank you

Anosh Wadia
User - Author
Post count: 19
#5

Thanks Aron, I downloaded the latest version and it resolved the issue.

Anosh Wadia
User - Author
Post count: 19
#6

Unfortunately it looks like I've run into another issue. My import completed 5575 records out of a total of 8358, but then I got the following error:

Total Runtime: 00:45:39.455

Sage.SData.Client.Core.SDataClientException: The remote server returned an error: (500) Internal Server Error. ---> Sage.SData.Client.Framework.SDataException: The remote server returned an error: (500) Internal Server Error. ---> System.Net.WebException: The remote server returned an error: (500) Internal Server Error.
at System.Net.HttpWebRequest.GetResponse()
at Sage.SData.Client.Framework.SDataRequest.GetResponse()
--- End of inner exception stack trace ---
at Sage.SData.Client.Framework.SDataRequest.GetResponse()
at Sage.SData.Client.Core.SDataService.ExecuteRequest(String url, RequestOperation operation, MediaType[] accept)
at Sage.SData.Client.Core.SDataService.CreateEntry(String url, AtomEntry entry)
--- End of inner exception stack trace ---
at Sage.SData.Client.Core.SDataService.CreateEntry(String url, AtomEntry entry)
at Sage.SData.Client.Core.SDataService.CreateEntry(SDataBaseRequest request, AtomEntry entry)
at Sage.SData.Client.Core.SDataSingleResourceRequest.Create()
at StarfishEngine.StarfishService.PostStageSData(rowdat& RowData, Stage st)

In addition, I looked at the Event Viewer on the server running the SData service and I see the following:
Description:
Exception caught during the processing of a message

Verb: POST
Uri: http://slx09/sdata/slx/dynamic/-/accounts('')?format=atomentry

Original Message: Unable to perform find[SQL: SQL not available]

Inner Exception Message: Exception of type 'System.OutOfMemoryException' was thrown.

Stack Trace: at NHibernate.Impl.SessionImpl.List(CriteriaImpl criteria, IList results)
at NHibernate.Impl.CriteriaImpl.List(IList results)
at NHibernate.Impl.CriteriaImpl.Subcriteria.UniqueResult[T]()
at Sage.Platform.NHibernateRepository.Query.Criteria.UniqueResult[T]()
at Sage.SalesLogix.Security.Owner.get_User()
at Sage.Integration.Entity.Adapter.OwnerRequest.CopyEntityToFeedEntry(IOwner entity, Owner entry, InclusionNode include) in c:\Documents and Settings\Administrator.SSSWORLD-LOCAL\Application Data\Sage\Platform\Output\sdata\Owner.cs:line 300
at Sage.Platform.SData.Dynamic.DynamicRequestBase`3.CopyEntityParentToFeedEntry(TFeedEntry entry, TEntity entity, InclusionNode include)
at Sage.Integration.Entity.Adapter.AccountRequest.CopyEntityToFeedEntry(IAccount entity, Account entry, InclusionNode include) in c:\Documents and Settings\Administrator.SSSWORLD-LOCAL\Application Data\Sage\Platform\Output\sdata\Account.cs:line 1486
at Sage.Platform.SData.Dynamic.DynamicRequestBase`3.UpdateFeedEntryFromEntity(TFeedEntry entry, TEntity entity, InclusionNode include, Boolean includeSchema, Boolean isSingleEntry)
at Sage.Platform.SData.Dynamic.DynamicRequestBase`3.InternalPost(TFeedEntry entry)
at Sage.Platform.SData.Dynamic.DynamicRequestBase`3.DoPost(TFeedEntry entry)
at Sage.Integration.Entity.Adapter.AccountRequest.PostAccount(Account entry) in c:\Documents and Settings\Administrator.SSSWORLD-LOCAL\Application Data\Sage\Platform\Output\sdata\Account.cs:line 77
at Invokeba3e49f815894e00afdd5f3169a1f944.Invoke(Object , IRequest )
at Sage.Integration.Messaging.RequestTargetRegistration.RequestTargetInvoker.Invoke(IRequest request)
at Sage.Integration.Messaging.Request.Process(RequestTargetInvoker invoker)
at Sage.Integration.Messaging.MessagingService.Process(IRequest request)

Aron Hoekstra
Administrator
Post count: 2
#7

Hmm, I think the line "Inner Exception Message: Exception of type 'System.OutOfMemoryException' was thrown." is of particular interest here.

What kind of server are you running on, and how much RAM does it have? Did you happen to take a look at the task manager while or after this ran to see how much memory each process was using? That may be helpful.

But in the meantime, you can always pick up the rest of the import where it left off. On the run tab, put 5576 in the "Begin at Row" box and run again.
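
For what it's worth, the same resume option appears to be exposed by the Starfish web service itself: the ExecJob call in your stack trace takes BeginAtRow and EndAtRow arguments. A rough C# sketch of resuming at row 5576 through the generated proxy might look like the following; the synchronous ExecJob wrapper and the job ID are assumptions on my part, not something taken from this thread.

// Rough sketch only. It assumes the generated StarfishServiceSoapClient exposes a
// synchronous ExecJob wrapper with the same parameters as the engine method shown
// in the stack trace (JobID, Commit, ChainJobs, LoggingLevel, Argument, BeginAtRow,
// EndAtRow, RowThreadCount). "MyImportJob" is a placeholder job ID.
using StarfishAdmin.StarfishService;

class ResumeImport
{
    static void Main()
    {
        var client = new StarfishServiceSoapClient();

        // Equivalent to typing 5576 into the "Begin at Row" box on the Run tab.
        client.ExecJob(
            "MyImportJob",   // JobID (placeholder)
            true,            // Commit
            false,           // ChainJobs
            1,               // LoggingLevel
            "",              // Argument
            "5576",          // BeginAtRow
            "",              // EndAtRow (empty = run to the end, assumed)
            "1");            // RowThreadCount

        client.Close();
    }
}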

Anosh Wadia
User - Author
Post count: 19
#8

Hi Aron,

I'm running this on an Intel Xeon 2.83 GHz CPU with 4 GB of RAM.
I did not check the task manager when this error occurred; I'll keep an eye out if it occurs again and let you know.

I'm re-running the import now from that row and so far it has imported about 1000 additional rows.
The machine is also running SQL Server, and the task manager currently shows SQL Server using 1.7 GB of RAM. It also shows 640 MB of RAM free (with the import running in the background); however, as I type this I'm watching the free RAM drop to 560 MB, and it's falling steadily...
Looking at the processes, the memory is being consumed by the w3wp.exe process, which is probably the SData site being used for the import. I believe this is the culprit.
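
In case it helps anyone watching for the same symptom, here is a small stand-alone sketch (plain .NET System.Diagnostics, not part of Starfish or SData) that polls the w3wp.exe worker processes and prints their private memory, instead of keeping Task Manager open:

// Stand-alone monitoring sketch: polls the IIS worker processes and prints their
// private memory every 10 seconds so the climb is visible without Task Manager.
using System;
using System.Diagnostics;
using System.Threading;

class WatchW3wp
{
    static void Main()
    {
        while (true)
        {
            foreach (Process p in Process.GetProcessesByName("w3wp"))
            {
                Console.WriteLine("{0:T}  w3wp PID {1}: {2:N0} MB private memory",
                    DateTime.Now, p.Id, p.PrivateMemorySize64 / (1024 * 1024));
            }
            Thread.Sleep(TimeSpan.FromSeconds(10));
        }
    }
}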

Thanks!
Anosh

Aron Hoekstra
Administrator
Post count: 2
#9

More than likely, but also keep in mind that Starfish uses IIS as well. Having SData, IIS, SQL and Starfish all on one machine may be a bit too much.

Anosh Wadia
User - Author
Post count: 19
#10

Wanted to post an update...

Since I'm forced to run the import on a machine that does not have adequate memory, the import (and specifically the SData portal) would crash every 2000 rows or so.

It was getting pretty tedious to have to resume the import every 2000 rows, so instead I adjusted the settings of the SData application pool so that it does not exceed a certain memory size. When it reaches this limit, IIS recycles the app pool. Thankfully this does not affect Starfish's import; all it does is cause a bit of delay while the SData site compiles again, and then the import continues as normal.
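
For anyone on IIS 7 or later, the same memory-based recycling limit can also be scripted with Microsoft.Web.Administration; the sketch below is an illustration only, the app pool name is a placeholder, and on IIS 6 the equivalent setting lives on the application pool's Recycling tab in IIS Manager.

// Illustration only, for IIS 7+ (requires a reference to Microsoft.Web.Administration
// and administrative rights). It sets a private-memory recycling limit so the SData
// worker process is recycled before it runs the box out of RAM.
using Microsoft.Web.Administration;

class LimitSDataPool
{
    static void Main()
    {
        using (ServerManager serverManager = new ServerManager())
        {
            // "SDataAppPool" is a placeholder; use whatever pool the SData site runs under.
            ApplicationPool pool = serverManager.ApplicationPools["SDataAppPool"];

            // PrivateMemory is expressed in kilobytes; 800000 KB is roughly 780 MB.
            pool.Recycling.PeriodicRestart.PrivateMemory = 800000;

            serverManager.CommitChanges();
        }
    }
}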

Thought I'd mention this in case someone else comes across a similar issue in the future.
