Showing posts with label memory. Show all posts

Friday, March 23, 2012

Out of virtual memory

I have one package that is constantly running out of virtual memory; right now I am on a development server, running only this package. The package is moving data from one table into another on the same server in the same database. The server has 3 GB of memory and is only running SS2005 and SSIS. I am a local admin on the server and running the package through BIDS, once again for our initial testing. I tried setting the BufferTempStoragePath property to our E drive so it can utilize the 100 GB of free space we have, but that doesn't seem to work either. I have also tried setting the MaxRowSize to many different values, to no avail. I am constantly getting an error (see below for the exact error) when it gets through roughly half the load. Moreover, it reports this error about 500 times in the progress report if I let the package run to completion. Finally, when all is said and done the package has moved the data successfully, but the package always shows as failing.

I have googled continuously on this problem but have not found a resolution. I did see a post here where it was recommended to run the package out of process; however, I don't see the benefit at this point when this is the only package I am running. I also don't understand why it would report the error so many times and fail the package when it is completing successfully. Source and destination have the same number of records at the end of the task. Could someone please help me make sense of this?

Getting Error:
[DTS.Pipeline] Information: The buffer manager detected that the system was low on virtual memory, but was unable to swap out any buffers. 4 buffers were considered and 4 were locked. Either not enough memory is available to the pipeline because not enough is installed, other processes are using it, or too many buffers are locked.

Thank you for your assistance!|||Write a SQL command like "select * from destination_table where 1=0"; otherwise SSIS will retrieve all data from your destination table.|||Hi icewill,

Not quite sure I understand why I would do that; I do want all the records. Would you please elaborate? We are loading a data warehouse. Thanks.|||Hello there,

I had this problem a while ago. By reducing "DefaultBufferMaxRows" to a certain amount (my package was set to 5000) and keeping "DefaultBufferSize" as small as possible (the default is 10 MB, the max is 100 MB), your package should run successfully. Please keep in mind that testing and measuring can take a little time to reach the best performance.

Good luck|||Hi Steglitz,

I have tried numerous settings but none of them were successful. I have tried DefaultBufferMaxRows at the following values: 2000, 5000, 7500 and 10000, along with DefaultBufferSize set to the default, 25 MB, 50 MB and 75 MB. When running Performance Monitor the only difference I noticed was that the number of buffers went up as the values were increased. Here is an article I found explaining buffer sizing:

Understanding How the Data Flow Task Sizes the Buffers

The data flow engine begins the task of sizing its buffers by calculating the estimated size of a single row of data. Then it multiplies the estimated size of a row by the value of DefaultBufferMaxRows to obtain a preliminary working value for the buffer size. If the result is more than the value of DefaultBufferSize, the engine reduces the number of rows.
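To make that rule concrete with purely illustrative numbers: if the engine estimates a row at 1,200 bytes and DefaultBufferMaxRows is 10,000, the preliminary buffer would be about 12 MB; since that exceeds a 10 MB DefaultBufferSize, the engine would cut the buffer back to roughly 10 MB / 1,200 bytes, or about 8,700 rows per buffer.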

|||Here is another issue to add on top of the previous one... Once the package runs, SQL Server is left holding all of the system's memory. We watched it both through Performance Monitor and through Task Manager, and after an hour it had not released the memory... 1.4 GB and holding. The only way we could get it to reduce was to restart the SQL Server service. Is anyone else out there having any of these issues?|||

Bagles,

Actually I don't have any idea why you need data from the destination table. I ran into the problem you mentioned before, and in the end I found that if you select a table as your destination directly, or write SQL like "select * from destination_table", SSIS will retrieve all data from the database. If there is a great number of records in your destination table, that will drive your SSIS out of memory, which is not what I want to see. My main purpose is to get the fields to map, so I modified the SQL to "select * from destination_table where 1=0", and the problem was solved perfectly.
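For illustration (the table name here is just a placeholder), a metadata-only query of that shape would be:

-- Returns zero rows, but still exposes the full column list for mapping
SELECT * FROM dbo.destination_table WHERE 1 = 0

Because the WHERE clause can never be true, the designer gets the column metadata without pulling any rows across.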

GL~

|||Hi Icewill,

Using a query like that to obtain the metadata makes sense, and I suppose this is probably a silly question, but are you then using an expression to inject the actual sql statement you want at runtime?

I am currently using the DataReader source component, which does not allow that; should I be using the OLE DB Source instead? Overall, is the DataReader component a bad choice?

Thank you all in advance!|||There are a variety of reasons your server could run out of virtual memory with both SSIS and SQL2k5 running on it, so a couple of questions first.

1. What OS? Is it 32-bit or 64-bit?
2. What is the recovery model of your target database?
3. What destination type are you using? SQL Server destination or OLE DB destination?
4. If you're using an OLE DB destination, what settings are being used, including DataAccessMode and fast-load settings if applicable?
5. What is the volume of data you're moving through the pipeline?

As for icewill's notion that SSIS, via the SQL statement, is retrieving all the data from the destination table, which may be relieved by adding "1=0", that assertion is patently untrue and can be observed not to be the case using Profiler.

We have partitioned tables with billions of rows as target tables, and in no case was all the data selected merely by choosing the table as a target, either during design or at run-time.|||Greetings Jaegd,

1) We are using Windows Server 2003 Standard Edition with SP1, 32bit.
2) The recovery model is Simple
3) We are using the OLEDB destination with the data access mode of Table Name variable with Fast load.
4) Any package that runs into problems does so when there are more than several hundred thousand rows. I currently don't have a definitive threshold, but usually if we are moving more than 400,000 rows we run into a problem.

So far, here is what we have discovered through several days of testing:

1) It doesn't appear that SSIS is spooling (writing the stream to disk) as it should. The default setting is the user's temp directory; we have currently set spooling to go to our E: drive due to a lack of space on our C: drive. We have been testing for a solid week and Performance Monitor has not reported a single spooled buffer to date. We are currently profiling all of the SQLServer:SSIS Pipeline counters available in Performance Monitor, as well as the following general categories: SQLServer:Memory Manager, Processor, Physical Disk (all 3 drives), Memory (Available MBytes), and the .NET Data Provider for SqlServer.

2) We finally found the event handler for the logging of BufferResizing, and from that we have gathered that SSIS will try to fill the buffer with the maximum amount of memory first (100 MB) and not the default buffer size of 10 MB. Please correct me if I am wrong, but that is what we are witnessing.

One thing that would be great to see in the future is the number of bytes SQL Server is calculating per row and the number of bytes it is putting in the stream. This would allow people to gain a better understanding of how to use the DefaultBufferMaxRows setting, rather than just guessing at numbers, which is essentially what we have been doing.

3) Finally, the biggest adjustment we have made, and the one which seems to be working, is to limit the amount of memory that SQL Server itself will use. As I stated in a previous post, SQL Server was using all the memory it could get and then not releasing it, which was killing the capabilities of Integration Services. We changed the setting in SQL Server on this particular box to allow SQL Server only half of the available memory, which seems to have fixed the problem. SQL Server will now give back the memory that it has used over that setting, which frees up enough for SSIS to run properly.
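For anyone wanting to try the same cap, a minimal T-SQL sketch (the 1536 MB value is only an illustration for a 3 GB box; pick whatever leaves enough headroom for SSIS and the OS):

-- 'max server memory (MB)' is an advanced option, so expose it first
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
-- Cap the SQL Server buffer pool so SSIS and the OS keep some memory
EXEC sp_configure 'max server memory (MB)', 1536
RECONFIGURE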

I would still love to hear from you and others that have experienced this problem or have advice on how to avoid it. The above is just what we have witnessed through our own trials and tribulations.

Thank you for your assistance and experience!|||I have taken an alternative route (which may or may not be viable in your implementation). That route is always running SSIS on separate hardware from SQL2k5 (there are licensing issues here, naturally). Note, I'm not recommending SSIS/SQL separation for the general case, by the way, since separation adds substantial unnecessary network round trips and eliminates the use of the high-speed SQL Server destination.

However, SQL/SSIS separation does allow SQL Server "domination" over the memory/CPU on its hardware. Furthermore, and most importantly, separation allows the construction of an SSIS farm, that is, processing over N number of SSIS only machines.

As for x64 vs x86, generally speaking, running x64 SSIS gives me about a 30% boost (from a wall-clock perspective) over running x86 SSIS on the exact same job, an advantage which becomes greater at larger volume levels.

I know this doesn't speak to the spooling issue or the optimization of DefaultBufferMaxRows; virtual memory has become a non-issue since moving to a hardware separation approach.|||Hi jaegd,

I really appreciate your input! Unfortunately I don't have any control over the hardware that is being used; however I do find it beneficial to know how others are using the product and found your use of it a nugget to hold onto for the future! Thanks again!|||

I have been thinking about creating an SSIS farm; have you seen any documentation on how to set this up, and any best practices?

I also installed the x64 version of SSIS; any ideas how I can confirm this?

cheers

Marcus


Wednesday, March 21, 2012

Out of memory while querying causes wrong data to be returned

Running AS 2000, SP4.
A couple of times while querying a cube, one member has had the wrong data displayed (the values are too low).
No matter whether I reload the query afterwards, restart the query application, etc., the same faulty value still gets displayed for that member.

After I restart Analysis Services the value is correct again. No other changes, exactly the same query, etc.
While analysing the Event Viewer I see that Analysis Services had run out of memory before this strange behaviour (MSSQLServerOLAPService - "The server is out of memory").

Could this out-of-memory error have caused the wrong value to be returned, with the value somehow getting cached on the Analysis Server so it isn't fixed until the server has been restarted?
This is the only theory I have, and it's important to figure out the cause. Users are getting worried if they can't trust the data.

The server is dedicated to Analysis Services and has 3GB of memory with the /3GB switch enabled. The memory conservation threshold is 2700 MB and minimum allocated memory is set to 1536 MB.
The query tool is an ASP application using ADOMD, but the same faulty value was returned when trying the query in the MDX Sample App.

First of all try installing SP4 http://www.microsoft.com/downloads/details.aspx?displaylang=en&FamilyID=8E2DFC8D-C20E-4446-99A9-B7F0213F8BC5

Second, if problem persists contact customer support and have them looking into your problem.

Edward.
--
This posting is provided "AS IS" with no warranties, and confers no rights.

|||Sorry, misprint; I am already running SP4.|||

There isn't much to recommend in this case.

Have your problem looked at by customer support (the doctors) :)

HTH

Edward.
--
This posting is provided "AS IS" with no warranties, and confers no rights.



Out of memory problem

We have SQL Server 2000 on a Win2k Server with 2GB of RAM. I have set the server to dynamically configure SQL Server memory. Currently the maximum memory is set at 1.8GB. Over time SQL Server occupies most of the memory, leaving about 10 MB of available memory. I read in one of the Microsoft articles that this is OK if the available memory gets this low and that SQL Server manages the memory fine. Well, it keeps running this way for some time... 1 to 2 days, and then users start losing connections, cannot make new connections, and a stored procedure which creates an OLE object starts failing with the message given at the bottom of this post. Note the OLE object is created and run within the SQL Server memory and not outside of it.
I had set the max memory for SQL Server to 2GB initially. I reduced this to 1.8GB, and yet after a couple of days of functioning OK the server started giving these errors. So every time this happens we have to stop the SQL Server service and start it again. Can anyone let me know why this is happening and what the SQL Server settings should be to prevent this from happening?
Server: Msg 14, Level 16, State 1, Line 0
Not enough storage is available to complete this operation.
Server: Msg 10019, Level 16, State 1, Line 0
sp_OACreate has not yet been called successfully for this command batch.
OLE Automation Error Information
HRESULT: 0x80004005
Thanks,
Vasu
Based on the symptoms and this comment:
<<
Note the OLE object is created and run within the SQL Server memory and not
outside of it.
I'd be willing to bet an awful lot of money that there is a) a memory leak
in your COM object or b) you've hit a mem leak error in the MDAC stack (or
somewhere).
I'd make sure I'm on all the latest service packs for SQL and any software
that you could be calling from the COM object.
Also... you might be able to a) prove that the leak is related to your COM
object and b) work around the problem for a short period of time by running
the sp_OA procs outside the SQL memory space.
But I'd troubleshoot this as a memory leak error for now...
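As a side note on "running the sp_OA procs outside the SQL memory space": sp_OACreate takes an optional third context parameter, and a sketch of an out-of-process call looks like this (the ProgID is hypothetical):

DECLARE @hr int, @obj int
-- context 4 = local (out-of-process) OLE server, so a leak in the COM object
-- grows in a separate process rather than inside the SQL Server address space
EXEC @hr = sp_OACreate 'MyCompany.MyComObject', @obj OUTPUT, 4
IF @hr <> 0 PRINT 'sp_OACreate failed'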
--
Brian Moran
Principal Mentor
Solid Quality Learning
SQL Server MVP
http://www.solidqualitylearning.com
"Vasu Venkat" <anonymous@.discussions.microsoft.com> wrote in message
news:7D8CD8F5-918D-41DB-974A-3AA3B6F3026A@.microsoft.com...
> We have SQL Server 2000 on a Win2k Server with 2GB of Ram. I have set the
server to Dynamically configure SQL Server memory. Currently the maximum
memory is set at 1.8GB. Over time SQL Server occupies most of the memory
leaving about 10 MB available memory. I read that this is ok if the
available memory gets down as low and that SQL Server manages the memory
fine from one of the microsoft articles. Well, it keeps running this way for
some time.. 1 to 2 days and then users start losing connections, cannot make
new connections and a stored procedure which creates a OLE object starts
failing with this message given at the bottom of the message. Note the OLE
object is created and run within the SQL Server memory and not outside of
it.
> I had set the max memory for SQL Server as 2GB intially. I reduced this to
1.8GB and yet after couple of days of functioning ok, the server started
giving these errors. So every time this happens we have to stop the SQL
Server service and start again. Can anyone let me know why this is happening
and what the SQL Server settings should be to prevent this from happening.
> Server: Msg 14, Level 16, State 1, Line 0
> Not enough storage is available to complete this operation.
> Server: Msg 10019, Level 16, State 1, Line 0
> sp_OACreate has not yet been called successfully for this command batch.
> OLE Automation Error Information
> HRESULT: 0x80004005
> Thanks,
> Vasu
>

Out of memory Exception when Running a package with XML Task

Hi..

I'm running a package that has an XML Task in the control flow. This task transforms an XML file with an XSLT.

The file is about 2 megs on a daily basis, but at the end of the month there is a full dump of data that makes the file around 400 megs. That is where my problem is.

I run this on my 2 GB memory workstation, and when the memory gauge in Task Manager reaches about 1.5 GB the package fails with an "Out of memory exception".

I also ran this package on an 8 GB RAM server, and the same applies.

Is there any way of making this package utilize all the available memory? I even increased the virtual memory to see if that helped my issue, but nothing.

Thanks

Are you running on a 32-bit or 64-bit OS? 32-bit servers are generally limited to about 2 GB of addressable memory, unless you have made some changes to the server configuration.|||

I'm running 32-bit... I wasn't aware of that!! I hope my admins knew that and made whatever change is needed (I'm being highly optimistic, giving them the benefit of the doubt).

**UPDATE**

Yes... The server has the "/PAE" switch in the boot.ini, so it's configured to use the 8 gigs of memory available to it.

|||Are other things running on the server - for example, the SQL Server relational engine? How much memory is allocated to it?|||

I'm looking at the Performance Tab in Task Manager and there is

Total Physical Memory: 8.32 Gigs

Available Physical Mem: 7.12 Gigs

Of the roughly 1 gig of memory taken, about 600 megs is devoted to System Cache and the rest is committed to running programs like SQL Server.

The only SQL server 2005 services that are running are

SSIS (of course) Smile

Full Text Search

SQL Server

SQL Server Agent

Again... Thanks for all your help

|||

This is a good workaround.

I used a Script Task (I find myself using it more and more as I find limitations in SSIS) and with 3 simple lines of code I resolved the issue.

Code Snippet

Dim xslt As New XslTransform()   ' System.Xml.Xsl.XslTransform
xslt.Load("C:\Temp\Template.xslt")   ' load the stylesheet
xslt.Transform("C:\Temp\XMLSource.xml", "C:\Temp\XMLSource_Revised.xml")   ' write the transformed output

This code performs the task in about 5 minutes, consuming about 900 megs of memory, for a 500 meg file.


Out of Memory Exception

I'm getting an out of memory exception when designing my report model and selecting save. I'm using 64-bit Windows XP, 4 GB RAM and two dual-core Xeons, so presumably this is a bug! Anyone have any ideas?
Hello Phil,
I would like to know whether there is any related error in the Report Log
File.
Would you please try to check the Report Log File and post the error here
for me to troubleshoot?
By default, the log file is located at \Microsoft SQL Server\<SQL Server
Instance>\Reporting Services\LogFiles.
Please let me know the result. Thank you!
Sincerely,
Wei Lu
Microsoft Online Community Support
==================================================
Get notification to my posts through email? Please refer to
http://msdn.microsoft.com/subscriptions/managednewsgroups/default.aspx#notifications.
Note: The MSDN Managed Newsgroup support offering is for non-urgent issues
where an initial response from the community or a Microsoft Support
Engineer within 1 business day is acceptable. Please note that each follow
up response may take approximately 2 business days as the support
professional working with you may need further investigation to reach the
most efficient resolution. The offering is not appropriate for situations
that require urgent, real-time or phone-based interactions or complex
project analysis and dump analysis issues. Issues of this nature are best
handled working with a dedicated Microsoft Support Engineer by contacting
Microsoft Customer Support Services (CSS) at
http://msdn.microsoft.com/subscriptions/support/default.aspx.
==================================================
(This posting is provided "AS IS", with no warranties, and confers no
rights.)|||Hi Phil,
How is everything going? Please feel free to let me know if you need any
assistance.
Sincerely,
Wei Lu
Microsoft Online Community Support

Out of memory error when running large report

I have built a large report but when I try to execute inside of the Report
Designer, I get the following error and then an abort:
[An error has occurred during report processing
Exception of type System.OutOfMemoryException was thrown]
I have applied SRS SP1 and here is the MSD info:
Microsoft Visual Basic .NET 69457-005-1826837-18410
Microsoft SQL Server Report Designer Version 8.00.878.00
Running on XP SP2 with 1.25 gig RAM
Is there a work-around for this?
Thanks for any help!
http://support.microsoft.com/?kbid=840709
This article may help you.
| From: "Mike Harbinger" <MikeH@.Cybervillage.net>
| Subject: Out of memory error when running large report
| Date: Thu, 6 Jan 2005 18:47:55 -0800
|||Thanks for the response, Brad, but I am not getting the error from the report
server but from inside of the .Net Visual Studio Reporting Services
designer.
""Brad Syputa - MS"" <bradsy@.Online.Microsoft.com> wrote in message
news:g7L9ZjO9EHA.3200@.cpmsftngxa10.phx.gbl...
> http://support.microsoft.com/?kbid=840709
> This article may help you.
>
> --
> | From: "Mike Harbinger" <MikeH@Cybervillage.net>
> | Subject: Out of memory error when running large report
> | Date: Thu, 6 Jan 2005 18:47:55 -0800
> | I have built a large report but when I try to execute inside of the
> Report
> | Designer, I get the following error and then an abort:
> |
> | [An error has occurred during report processing
> | Exception of type System.OutOfMemoryException was thrown]
> |
> |
> | I have applied SRS SP1 and here is the MSD info:
> | Microsoft Visual Basic .NET 69457-005-1826837-18410
> | Microsoft SQL Server Report Designer Version 8.00.878.00
> |
> | Running on XP SP2 with 1.25 gig RAM
> |
> | Is there a work-around for this?
> |
> | Thanks for any help!
> |
> |
> |
>

Out of Memory Error when running a Select Statement

Hi all,

In the Object Explorer in SQL Server 2005 Management Studio, if I right-click on the table I want to open and choose Open, I eventually get an out of memory error with an error source of MSCORLIB.

This table contains 40m records, but when it opens the table it goes past the 40m and wants to open 83m records, and then eventually comes up with an out of memory error.

The properties of the table indicate that there are 40m records, but when you open the table, at the bottom where it says "retrieving data", it counts a number of records that surpasses the 40m total.

Is it because the database is too big, or is there a way to configure SQL Server 2005 to use X amount of memory?

Regards

Mark:

You might want to temper opening large tables with Management Studio or other tools. If your table contains 40 million rows of data and each row of data is on average 1000 bytes per row, opening this table might take 40 GB of memory to open and display the data. Be careful about viewing all rows of large tables.


Dave
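A minimal illustration of that advice (the table name here is hypothetical): pull a sample of the rows rather than opening the whole table in the grid.

-- Peek at a slice of the data instead of all 40 million rows
SELECT TOP (1000) * FROM dbo.MyLargeTable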

|||

Cheers Dave,

I have used Management Studio, but you are probably right about the memory requirement.

Thanks

Out of memory error

I'm trying to generate a very large report. My server's got 32 gigs of memory though, so memory shouldn't be an issue. However, when I try to generate my report, I get the following outofmemory error message in the error log on the server:

EventType clr20r3, P1 w3wp.exe, P2 6.0.3790.3959, P3 45d6968e, P4 mscorlib, P5 2.0.0.0, P6 4333ab80, P7 116e, P8 29, P9 system.outofmemoryexception, P10 NIL.

The w3wp.exe process maxes out at ~1,250 MB. I imagine I just have to change a setting somewhere to allow this to grow beyond that, but I can't seem to find that setting. What do I have to do?

Thanks!

-Carl

Were you able to resolve this issue?

Anyone have any ideas?

|||I am receiving this same error in the desktop development studio - System.OutOfMemoryException was thrown with 1,044,612 rows. It runs fine in the "Data" tab and in Management Studio, but when I try to preview the report in the "Preview" tab I get this error. Any resolution for this?|||

I am having the same problem - the query runs perfectly fine in Management studio, or the Data tab - but when the report is generated it gives out of memory error.

Is there a resolution, or a workaround? (Would a custom ITemporaryStorage class resolve this issue? I can't find any example code or discussion on this apart from the MSDN API docs.)

Out of memory error

I'm trying to generate a very large report. My server's got 32 gigs of memory though, so memory shouldn't be an issue. However, when I try to generate my report, I get the following outofmemory error message in the error log on the server:

EventType clr20r3, P1 w3wp.exe, P2 6.0.3790.3959, P3 45d6968e, P4 mscorlib, P5 2.0.0.0, P6 4333ab80, P7 116e, P8 29, P9 system.outofmemoryexception, P10 NIL.

The w3wp.exe process maxes out at ~1,250 MB. I imagine I just have to change a setting somewhere to allow this to grow beyond that, but I can't seem to find that setting. What do I have to do?

Thanks!

-Carl

Were you able to resolve this issue?

Anyone have any ideas?

|||I am receiving this same error in the desktop development studio - system.outofmemory was thrown with 1044612 rows. It runs fine in the "Data" tab and in management studio but when I try to preview the report in the "Preview "tab I get this error. Any resolution for this?|||

I am having the same problem - the query runs perfectly fine in Management studio, or the Data tab - but when the report is generated it gives out of memory error.

Is there a resolution, or a workaround (would a custom ITemporaryStorage class resolve this issue - I can't find any example code or discussion on this except MSDN api docs)

Out of memory error

I'm trying to generate a very large report. My server's got 32 gigs of memory though, so memory shouldn't be an issue. However, when I try to generate my report, I get the following outofmemory error message in the error log on the server:

EventType clr20r3, P1 w3wp.exe, P2 6.0.3790.3959, P3 45d6968e, P4 mscorlib, P5 2.0.0.0, P6 4333ab80, P7 116e, P8 29, P9 system.outofmemoryexception, P10 NIL.

The w3wp.exe process maxes out at ~1,250 MB. I imagine I just have to change a setting somewhere to allow this to grow beyond that, but I can't seem to find that setting. What do I have to do?

Thanks!

-Carl

Were you able to resolve this issue?

Anyone have any ideas?

|||I am receiving this same error in the desktop development studio - system.outofmemory was thrown with 1044612 rows. It runs fine in the "Data" tab and in management studio but when I try to preview the report in the "Preview "tab I get this error. Any resolution for this?|||

I am having the same problem - the query runs perfectly fine in Management studio, or the Data tab - but when the report is generated it gives out of memory error.

Is there a resolution, or a workaround (would a custom ITemporaryStorage class resolve this issue - I can't find any example code or discussion on this except MSDN api docs)

sql

Out Of Memory

Hi Guys,
What I'm trying here is to run a .bat file containing osql commands on the server using ASP, which is generating an Out Of Memory error.

In Detail.

This .bat file has an osql command that connects to a database, with a parameter to run a .sql file and an output parameter to write the output to a .txt file. This works fine if I simply run the .bat file on the server - no problems - but when I try the same from an .asp file, running the .asp file from the same machine, it gives me the Out of Memory error.


Content of the .bat file is
@cd
@osql -U user -P Password -i c:\BatchUpdateTest\test.sql -o c:\BatchUpdateTest\test.txt
@pause


Content of the .sql file is
-- Copy the source table into an archive table
SELECT * INTO ARC_TABLENAME FROM TABLENAME
GO

DECLARE @del_error int
DECLARE @description VARCHAR(1000)

SELECT @del_error = @@ERROR
SELECT @description = description FROM MASTER..SYSMESSAGES WHERE error = @del_error

IF @del_error = 0
SELECT * FROM ARC_TABLENAME
ELSE
PRINT @description + ' ARC_TABLENAME'
GO


Content of the .asp file is

<%
// Backslashes must be escaped in JScript string literals
var server_shell = Server.CreateObject("WScript.Shell");
try{
server_shell.Run("c:\\BatchUpdateTest\\test.bat", 1);
}catch(e){
Response.Write(e.description);
}
%>



Actually, when I run the code in the .bat file from .asp it does not give any error. So what I did was put a try/catch block in to trap the error, which gives me this "Out of memory" error.

What should I do to solve this problem?
Please help me out with this.

Thanks in advance
Dinesh

I can't tell exactly what is going on here, but it seems to be limited to the web access method, since it works correctly from the batch file. So I would limit my search to that area - you might want to try turning on garbage collection:

http://support.microsoft.com/kb/911716

Buck Woody

Out of Memory

Hi Guys,
What I'm trying here is to run a .bat file containing osql commands on the server using ASP, which is generating an Out Of Memory error.

In Detail.

This .bat file has an osql command that connects to a database, with a parameter to run a .sql file and an output parameter to write the output to a .txt file. This works fine if I simply run the .bat file on the server - no problems - but when I try the same from an .asp file, running the .asp file from the same machine, it gives me the Out of Memory error.


Content of the .bat file is
@cd
@osql -U user -P Password -i c:\BatchUpdateTest\test.sql -o c:\BatchUpdateTest\test.txt
@pause

Content of the .sql file is
-- Copy the source table into an archive table
SELECT * INTO ARC_TABLENAME FROM TABLENAME
GO

DECLARE @del_error int
DECLARE @description VARCHAR(1000)

SELECT @del_error = @@ERROR
SELECT @description = description FROM MASTER..SYSMESSAGES WHERE error = @del_error

IF @del_error = 0
SELECT * FROM ARC_TABLENAME
ELSE
PRINT @description + ' ARC_TABLENAME'
GO


Content of the .asp file is
<%
// Backslashes must be escaped in JScript string literals
var server_shell = Server.CreateObject("WScript.Shell");
try{
server_shell.Run("c:\\BatchUpdateTest\\test.bat", 1);
}catch(e){
Response.Write(e.description);
}
%>

Actually, when I run the code in the .bat file from .asp it does not give any error. So what I did was put a try/catch block in to trap the error, which gives me this "Out of memory" error.

What should I do to solve this problem?
Please help me out with this.

Thanks in advance
Dinesh
You need to troubleshoot from ASP side. It is unlikely this has anything to do with SQL or the batch file itself.


Out of Memory

Hi,

Is there a maximum threshold for the objects that I can place in one package? I'm encountering an intermittent out-of-memory error in IS. And if I am able to run the project, it hangs while processing the objects.

any idea how i can resolve this?

cherriesh

Not sure if there is a maximum (maybe one of the MSFT SSIS team could comment), but I've never encountered one. How large is the package? I've found it is better to break really large packages into smaller ones (< 1.5 MB in size).|||I see... mine has reached 10 MB...

Out of Memory Error while reading records

Hi,

I'm using CR9 with PostgreSQL.

While trying to run a report for a particular date, it returns

"Failed to open rowset"

"Out of memory while trying to read tuples"

But when running the same thing as a SQL query from Postgres, limiting the record count, it works fine. So I think the problem is with Crystal Reports.

Is there any way to limit the number of records? I also tried to limit the records in CR RAS; it's not working.
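For reference, the kind of limited query that works when run directly against Postgres looks roughly like this (table, column and date are made up):

-- Cap the result set so the client never has to hold every row
SELECT * FROM sales_fact WHERE report_date = '2007-06-30' LIMIT 100000;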

Any idea, pls. help me

Thanks
What's the maximum size of table / maximum number of records that Crystal Reports supports?

OR

Does it depend on the physical memory of the machine?

I am caught by the error "Out of Memory while reading tuples". The same table works when filtering the records by another field that returns a smaller number of records.

If you have any ideas, please share them with me.

Thanks

Tuesday, March 20, 2012

OTP: help

I just want to know how I increase my virtual memory. I know I can; I did it on my old PC but can't remember how. Please help stop the freezing of my gaming.

As you may have guessed when you posted in a 'SQL Server replication' forum, this forum is not the right place for this question. In fact, none of these forums are right for this question. You need to find a forum dedicated to consumer level questions. And you probably need to buy more RAM.

Friday, March 9, 2012

OSQL question

From a command prompt I try to connect to my newly installed instance of MSDE. I use the following:
osql -U sa
and I get:
[SHARED MEMORY] Invalid Connection
[SHARED MEMORY] ConnectionOpen(InvalidInstance())
What am I doing wrong here?
Thanks for the information.
Brad
Did you install a named instance?
Specify the server name on the osql command line.
Also, I hope you didn't setup your MSDE instance with a blank password.
Jim
"Brad" <allcomppc@.sbcglobal.net> wrote in message
news:OsYsz1dyEHA.1204@.TK2MSFTNGP10.phx.gbl...
> From a command prompt I try to connect to my newly installed isntance of
> MSDE. I use the following:
> osql -U sa
> and I get:
> [SHARED MEMORY] Invalid Connection
> [SHARED MEMORY] ConnectionOpen(InvalidInstance())
> What am I doing wrong here?
> Thanks for the information.
> Brad
>
|||I did install a named instance and also set the sa password.
I tried using -S <server> but I get the same message.
Brad
"Jim Young" <thorium48@.hotmail.com> wrote in message
news:%23nHX%23igyEHA.1392@.tk2msftngp13.phx.gbl...
> Did you install a named instance?
> Specify the server name on the osql command line.
> Also, I hope you didn't setup your MSDE instance with a blank password.
> Jim
> "Brad" <allcomppc@.sbcglobal.net> wrote in message
> news:OsYsz1dyEHA.1204@.TK2MSFTNGP10.phx.gbl...
>
|||try OSQL.EXE -L
>I did install a named instance with also setting the sa password.
> I tried using the -S <server> but I get the same message.
> "Jim Young" <thorium48@.hotmail.com> wrote in message
> news:%23nHX%23igyEHA.1392@.tk2msftngp13.phx.gbl...
|||Try -S <server>\<Instance name>
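For example (server name, instance name and sa password below are placeholders):

osql -S MYSERVER\MYINSTANCE -U sa -P YourPassword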
Michael Tissington
http://www.oaklodge.com
http://www.tabtag.com
"Brad" <allcomppc@.sbcglobal.net> wrote in message
news:%23PKaFvmyEHA.2316@.TK2MSFTNGP15.phx.gbl...
>I did install a named instance with also setting the sa password.
> I tried using the -S <server> but I get the same message.
> Brad
> "Jim Young" <thorium48@.hotmail.com> wrote in message
> news:%23nHX%23igyEHA.1392@.tk2msftngp13.phx.gbl...
>