
Thursday, March 29, 2012

EXEC stored procedure

I have a stored procedure that creates temp tables. It has two parameters,
@tablename and @type. I execute it with the following command.
EXEC _sp_CreateTempTable 'TableA', 'T'
With these parameters, this will create a temp table named ##TMP_TableA
based on the original table TableA.
The problem is that when I run this on one of our servers it works, and when I
run it on another I get the following error.
Msg 102, Level 15, State 1, Line 1
Incorrect syntax near ','.
Help!
Dave
Can you post the proc?
Hope this helps.
Dan Guzman
SQL Server MVP
"DaveF" <dave@aol.com> wrote in message news:5Ingh.8974$495.126@trnddc06...
>I have a stored procedure that creates temp tables. It has two parameters,
>@tablename and @type. I execute it with the following command.
> EXEC _sp_CreateTempTable 'TableA', 'T'
> With these parameters, this will create a temp table named ##TMP_TableA
> based on the original table TableA.
> The problem is when I run this on one of our servers it works and when I
> run it on another I get the following error.
> Msg 102, Level 15, State 1, Line 1
> Incorrect syntax near ','.
> Help!
> Dave
>
|||Unfortunately not. It is proprietary information.
Do you think the error is coming from within the executed proc?
Dave
"Dan Guzman" <guzmanda@nospam-online.sbcglobal.net> wrote in message
news:edQytE$HHHA.420@TK2MSFTNGP06.phx.gbl...
> Can you post the proc?
> --
> Hope this helps.
> Dan Guzman
> SQL Server MVP
> "DaveF" <dave@aol.com> wrote in message
> news:5Ingh.8974$495.126@trnddc06...
>
|||> Do you think the error is coming from within the executed proc?
Yes. I suspect the proc is executing a dynamic SQL statement (since you are
passing the table name parameter) and that's where the error is.
Hope this helps.
Dan Guzman
SQL Server MVP
"DaveF" <dave@aol.com> wrote in message news:6Ungh.9006$495.7049@trnddc06...
> Unfortunately not. It is proprietary information.
> Do you think the error is coming from within the executed proc?
> Dave
> "Dan Guzman" <guzmanda@nospam-online.sbcglobal.net> wrote in message
> news:edQytE$HHHA.420@TK2MSFTNGP06.phx.gbl...
>
|||I have a solution for you but unfortunately can't post it because it's
proprietary.
Just kidding. Try this and see if it works any better:
EXEC _sp_CreateTempTable @tablename = 'TableA', @type = 'T'
Make sure the SP is the same version (accepts the same # of parameters) on
both servers.
"DaveF" <dave@aol.com> wrote in message news:5Ingh.8974$495.126@trnddc06...
>I have a stored procedure that creates temp tables. It has two parameters,
>@tablename and @type. I execute it with the following command.
> EXEC _sp_CreateTempTable 'TableA', 'T'
> With these parameters, this will create a temp table named ##TMP_TableA
> based on the original table TableA.
> The problem is when I run this on one of our servers it works and when I
> run it on another I get the following error.
> Msg 102, Level 15, State 1, Line 1
> Incorrect syntax near ','.
> Help!
> Dave
>
|||Thanks for the idea but that didn't work.
I decided that since it worked on one machine and not the other that I would
compare Management Studio options and sure enough I found a difference. I
made the change and the error went away. Don't ask me why this particular
change fixed it because I don't know why.
Options-->Query Execution-->SQL Server-->Advanced
check SET CONCAT_NULL_YIELDS_NULL
Dave
"Mike C#" <xyz@xyz.com> wrote in message
news:%23loaRX$HHHA.1468@TK2MSFTNGP04.phx.gbl...
>I have a solution for you but unfortunately can't post it because it's
>proprietary.
> Just kidding. Try this and see if it works any better:
> EXEC _sp_CreateTempTable @tablename = 'TableA', @type = 'T'
> Make sure the SP is the same version (accepts the same # of parameters) on
> both servers.
> "DaveF" <dave@aol.com> wrote in message
> news:5Ingh.8974$495.126@trnddc06...
>
|||"DaveF" <dave@aol.com> wrote in message news:i2Hgh.51$hy6.13@trnddc05...
> Thanks for the idea but that didn't work.
> I decided that since it worked on one machine and not the other that I
> would compare Management Studio options and sure enough I found a
> difference. I made the change and the error went away. Don't ask me why
> this particular change fixed it because I don't know why.
> Options-->Query Execution-->SQL Server-->Advanced
> check SET CONCAT_NULL_YIELDS_NULL
Sounds like there's a NULL value being concatenated into a string somewhere
in your SP, most likely in some dynamic SQL you're trying to execute. That
setting controls NULL concatenation in strings. When it's set to ON,
concatenating NULL to a string returns NULL. When it's set to OFF, a
NULL concatenated to a string is treated like '' - an empty string.
Here's an example that shows the difference:
DECLARE @table VARCHAR(255)
SELECT @table = 'syscomments'
DECLARE @sql VARCHAR(255)
SELECT @sql = 'SELECT TOP 10 * FROM ' + @table
SELECT 'Table: syscomments', @sql
SET CONCAT_NULL_YIELDS_NULL ON
SELECT @table = NULL
SELECT @sql = 'SELECT TOP 10 * FROM ' + @table
SELECT 'Table: NULL, CONCAT_NULL_YIELDS_NULL ON', @sql
SET CONCAT_NULL_YIELDS_NULL OFF
SELECT @table = NULL
SELECT @sql = 'SELECT TOP 10 * FROM ' + @table
SELECT 'Table: NULL, CONCAT_NULL_YIELDS_NULL OFF', @sql
GO
SET CONCAT_NULL_YIELDS_NULL ON
SET CONCAT_NULL_YIELDS_NULL ON
When CONCAT_NULL_YIELDS_NULL is ON, appending NULL to a string returns NULL.
SQL Server will EXEC a NULL dynamic SQL string without complaint. When
CONCAT_NULL_YIELDS_NULL is OFF, the table name in this SELECT query is
appended as an empty string which will cause an error when SQL Server tries
to execute it.
Based on what you said, it sounds like one of the dynamic SQL statements
you're trying to execute has a NULL value appended to it somewhere, and
that's causing your error when that setting is OFF. It also means that the
particular dynamic SQL statement you think is being executed is not actually
being executed. You might want to check all the dynamic SQL statements that
should be executing (SELECT or PRINT them out right before the EXEC statement).
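The actual body of _sp_CreateTempTable was never posted, but a proc built this way can be made robust to the CONCAT_NULL_YIELDS_NULL setting rather than dependent on it. Here is a hedged sketch of what a more defensive version might look like; the procedure body, the use of SELECT...INTO, and the WHERE 1 = 0 trick are all assumptions, not the poster's actual code.

```sql
-- Hypothetical sketch of a defensive dynamic SQL pattern; the real
-- _sp_CreateTempTable body is not shown anywhere in this thread.
CREATE PROCEDURE _sp_CreateTempTable
    @tablename sysname,
    @type      char(1)
AS
BEGIN
    -- Fail fast instead of silently building a NULL or empty statement,
    -- regardless of the session's CONCAT_NULL_YIELDS_NULL setting.
    IF @tablename IS NULL OR @type IS NULL
    BEGIN
        RAISERROR('Both @tablename and @type are required.', 16, 1)
        RETURN
    END

    DECLARE @sql varchar(1000)
    -- QUOTENAME guards the source table name; WHERE 1 = 0 copies only
    -- the structure into the global temp table.
    SELECT @sql = 'SELECT * INTO ##TMP_' + @tablename
                + ' FROM ' + QUOTENAME(@tablename) + ' WHERE 1 = 0'

    -- PRINT the statement before executing it so a Msg 102 syntax error
    -- can be traced to the exact string that was run.
    PRINT @sql
    EXEC (@sql)
END
```

Validating the parameters up front turns a confusing downstream syntax error into an immediate, explicit one.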



exec stored proc and use udf or select in it

Hi,
I have a stored procedure with a cursor that populates a table
datatype variable with a few fields, one of which is of type [nText].
Now for each row in the above table type variable I have to call
another stored proc to which I have to pass the [nText] field value.
Can anyone suggest how to do it?
Thanks,
Vikas.
|||Could you call the stored procedure passing the ntext value before inserting
it into the table variable? You could call it after inserting as well, but
before fetching the next value from the cursor.
"vikassah@gmail.com" wrote:

> Hi,
> I have a stored procedure with a cursor that populates a table
> datatype variable with a few fields, one of which is of type [nText].
> Now for each row in the above table type variable I have to call
> another stored proc to which I have to pass the [nText] field value.
> Can anyone suggest how to do it?
> Thanks,
> Vikas.
>|||(vikassah@gmail.com) writes:
> I have a stored procedure with a cursor that populates a table
> datatype variable with a few fields, one of which is of type [nText].
> Now for each row in the above table type variable I have to call
> another stored proc to which I have to pass the [nText] field value.
> Can anyone suggest how to do it?
I don't really understand what you are trying to do. Maybe you could
be more specific?
Note, though, that you cannot assign to variables of the type ntext,
so you may have to rework your strategy.
Erland Sommarskog, SQL Server MVP, esquel@sommarskog.se
Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pr...oads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodin...ions/books.mspx
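Since ntext cannot be assigned to a local variable, one common workaround consistent with the advice above is to fetch only the row's key and let the inner procedure read the ntext column itself. This is a sketch with hypothetical names (SourceTable, its id column, and InnerProc are assumptions; the poster's actual schema was never shown).

```sql
-- Sketch with hypothetical names: instead of fetching the ntext value
-- into a variable (which T-SQL does not allow), pass the row's key and
-- let the inner procedure read the ntext itself.
DECLARE @id int

DECLARE cur CURSOR FOR
    SELECT id FROM SourceTable   -- SourceTable holds the [nText] column

OPEN cur
FETCH NEXT FROM cur INTO @id
WHILE @@FETCH_STATUS = 0
BEGIN
    -- InnerProc takes the key and reads the ntext column itself,
    -- e.g. with TEXTPTR/READTEXT or by using it directly in its queries.
    EXEC InnerProc @id = @id
    FETCH NEXT FROM cur INTO @id
END
CLOSE cur
DEALLOCATE cur
```

Passing a key instead of the value sidesteps the ntext variable restriction entirely and also avoids copying large values through the table variable.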

Monday, March 26, 2012

EXEC Command problem

Tried to do a search on EXEC and got a lot of hits, but none that I saw
addressed my problem.
I'm trying to move a DB, but when I type,
EXEC sp_detach_db 'database'
I get the unrecognized command error.
I did a search for EXEC, but couldn't find it.
Any suggestions/assistance?
Thanks.
hi,
"BobCo" <BobCo@discussions.microsoft.com> wrote in message
news:3B1E774A-0376-4CE3-B7D3-6BF054155724@microsoft.com
> Tried to do a search on EXEC and got a lot of hits, but none that I
> saw addressed my problem.
> I'm trying to move a DB, but when I type,
> EXEC sp_detach_db 'database'
> I get the unrecognized command error.
> I did a search for EXEC, but couldn't find it.
> Any suggestions/assistance?
> Thanks.
EXEC just executes the stored procedure (a system stored procedure, in
this case), and is required only when the call to the stored procedure is not
the very first statement of the current Transact-SQL batch,
so the problem is not EXEC ...
actually you usually should get
Server: Msg 229, Level 14, State 5, Procedure sp_detach_db, Line 50
EXECUTE permission denied on object 'sp_detach_db', database 'master', owner
'dbo'.
if your current user has not been granted membership in the sysadmin server
fixed role, as only members of that role are allowed to execute that system
stored procedure... further details at
http://msdn.microsoft.com/library/de...da-di_83fm.asp
Andrea Montanari (Microsoft MVP - SQL Server)
http://www.asql.biz/DbaMgr.shtm
http://italy.mvps.org
DbaMgr2k ver 0.9.1 - DbaMgr ver 0.55.1
(my vb6+sql-dmo little try to provide MS MSDE 1.0 and MSDE 2000 a visual
interface)
-- remove DMO to reply
|||Hi,
The EXEC command is just optional here, and the call to the stored procedure
sp_detach_db completes even if the EXEC prefix is omitted.
You can also try using Enterprise Manager to do the same task.
Girish Sundaram
This posting is provided "AS IS" with no warranties, and confers no rights.
|||hi Girish,
"Girish Sundaram" <girishs@microsoft.com> wrote in message
news:MjgS8Dl7EHA.2600@cpmsftngxa10.phx.gbl
> Hi ,
> The Exec command is just optional and the call to the stored procedure
> sp_detach_db command completes even if the exec prefix is ommitted.
> Also you can also try using the Enterprise manager to do the same
> task.
> Girish Sundaram
> This posting is provided "AS IS" with no warranties, and confers no
> rights.
actually you can't... AFAIK, EM cannot be used in conjunction
with MSDE instances..
Andrea Montanari (Microsoft MVP - SQL Server)
http://www.asql.biz/DbaMgr.shtm
http://italy.mvps.org
DbaMgr2k ver 0.9.1 - DbaMgr ver 0.55.1
(my vb6+sql-dmo little try to provide MS MSDE 1.0 and MSDE 2000 a visual
interface)
-- remove DMO to reply
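The point about when EXEC is required can be shown with two small batches against real system procedures (sp_helpdb is used here only as a convenient example; the original question used sp_detach_db):

```sql
-- First statement in the batch: the EXEC keyword may be omitted.
sp_helpdb
GO

-- Not the first statement in the batch: EXEC (or EXECUTE) is required;
-- without it SQL Server raises a syntax error on the procedure name.
DECLARE @name sysname
SET @name = 'master'
EXEC sp_helpdb @name
GO
```

This is why "unrecognized command" style errors around a procedure call usually point at batch position or permissions rather than at EXEC itself.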

Wednesday, March 21, 2012

Exclude User Defined Data Type owner from script

I need to exclude the User Defined Data Type owner when I script out my tables. I have set the ScriptingOptions.SchemaQualify = false which removes the owner from the table declaration but not from the User Defined Data Type.

I currently have to do a search and replace in order to remove all the [dbo].[MyDataType] from the CREATE TABLE scripts.

Any ideas how to fix this when creating my table script in SMO?

Given that the schema is part of the security framework of the database, why would you want to remove the schema from the create script? It would make the script unusable as a recovery tool.


Wednesday, March 7, 2012

Exception of type 'System.OutOfMemoryException' was thrown. (Microsoft.SqlServer.SmoEnum)

I just received this message:

Exception of type 'System.OutOfMemoryException' was thrown. (Microsoft.SqlServer.SmoEnum)

I am completely dead in the water. Cannot connect to SSAS 2005. HELP!

hello,

it is not quite clear what you are trying to do when the exception happens. Is the system really running low on memory?

the comment above seems to suggest that you are trying to connect to SSAS, however the exception seems to come from SMO (?), which is an OM for working with SQL Server (relational).

if the issue is that you are unable to connect to MSAS, could you please provide more information about the sequence of actions you take (also how you connect -- from Management Studio or in some different way; if from Management Studio, what you enter in the connection dialog, etc.), and more information about when the exception occurs, the stack trace, and other info about the error (in Management Studio the error message usually has a "Show Technical Details" button)?

thanks a lot,

|||

I am trying to do the initial load of SSAS 2005. Just loading the Microsoft SQL Server\90\Tools\Binn\VSShell\Common7\IDE\SqlWb.exe executable to get to the SSAS databases simply stops. I had to "split" the SSAS databases to get them imported in three splits. The SSAS 2000 database has 35 dimensions, 49 cubes, and 4 virtual FACT cubes. This was load #2 out of 3. I had just completed the cube builds on 31 out of 49 cubes and had just finished populating 138 million rows into the SSAS 2k5 database. I had done tests that put the total row count at 240 to 280 million rows when completed. The size of the total SSAS DB is currently projected at 16-20 GB when completed. I tried to re-implement the 2nd set of cubes and now SSAS 2k5 will not even start. I tried the /PAE as well as the /3GB switches with no success, each switch individually with a full reboot each time. No success. Help!

|||

Update. I took the switch /3GB OUT of the boot.ini and was successful in getting the SS Management Studio to at least start. Now, as I try to open cubes to work with them in Management Studio, it gives me the following error:

Exception of type 'System.OutOfMemoryException' was thrown. (Microsoft.SqlServer.SmoEnum)

Is there a 2GB memory limit in SSAS? When you put in the /3GB switch it actually fails to load initially. Is there some caching/hashing algorithm that allows the standard non-/3GB configuration to be successful and the /3GB switch enablement to fail in high-memory usage cases? This is a bit perplexing. The 2000 OLAP database runs at about 1.7 GB TOTAL in production today (the source for the migration wizard), and it is stable. Using the Database Migration Wizard (9.0.2047.0) I could NOT get it to migrate the whole 2k OLAP database. It hit the memory exception error. I had to "split" it up (Migration Wizard) to get the dimensions and cubes to load by chunking them into SSAS 2k5. I was successful on the first half of the 2k OLAP migration to SSAS 2k5, migrating 31 out of 49 cubes. I have subsequently loaded all of the dimensions (all 35 from above) with data and cubes (about 31 cubes with about 137 million rows total) with data. I tried to migrate the other outstanding 16 cubes, and all the errors started happening, and SSAS 2k5 Management Studio is failing to even load. My hardware/OS config is as follows:

C drive - 10 GB (RAID 1)

D drive - 8 GB (4095 pagefile is here) (RAID 10)

E drive - 50GB (RAID 10)

RAM - 8GB

OS - Server 2003 Enterprise (Open License Version 5.2 Build 3790.srv_03_sp1_rtm.050324-1447: ServicePack 1)

DB - SQL Server 2005 Enterprise - SP1/041806 installed 042006 - Full install of all components. I have stopped all SQL Services except SQL Browser and SSAS services.

Does anyone else have this issue with SSAS 2005 and hitting the ceiling-memory limit?

How can 2000 OLAP run at 1.7 GB and be stable and SSAS 2k5 fails to even load the exact same data using Microsoft's own toolset?

I am seeking to find the solution to this issue and am on an extremely aggressive schedule. HELP!

|||

Okay,

I have tried something to see if it will work and here is where I am at. I cleared the transferred SSAS DB from Management Studio completely. I re-migrated (in three MigrationWizard.exe sessions) most (32 dimensions and 38 cubes) of the existing objects into a new "stand-alone" SSAS 2K5 database. I then "imported" the now almost fully deployed (all that is left are the FACT tables) SSAS 2K5 database into BI Studio successfully using the import function in BI Studio! I then backed up and deleted the SSAS 2K5 DB from Management Studio. Now the bad news. I was able to get the BI project to "build" with no errors. Now, I am back to the SAME exception

'System.OutOfMemoryException' was thrown

error I had when only "trying" to partially migrate and populate even 50% of the cubes in Management Studio (see previous postings). Now, this same error shows up in the errors pane (bottom-left dialog box) in BI Studio with no other explanations. I have been combing the internet trying to get a solution. Can anyone out there help?

|||

We have a defect in the RTM and SP1 builds where an OutOfMemoryException occurs when SQL Management Studio connects to AS2005 with large database(s). We are fixing this for SP2.

Meanwhile, the work-around would be to use Visual Studio to connect to AS2005 to administer or process objects.

Or to temporarily separate the databases onto different AS2005 instances (then script each of them and run the scripts to combine them back on a single AS2005 instance).

Adrian Dumitrascu.

|||

The dimensions and cubes are now in ONE instance of BI Studio 2005. There are no more SQL or ANY databases in this box or connected to this instance. I am running a DELL 2850 with DUAL 3.8 GHz procs, 8GB of RAM, and ALL data partitions that SQL 2K5 touches are RAID 10/15k hard drives. This should be "flying" on 2K5 from all of the press. These databases are running perfectly on SQL 2000 AS at 1.7 GB of RAM with no issues. I want to be crystal-clear with my response. The objects are ALL now in BI Studio (definitions of only dimensions and cubes) and the SQL 2K5 BI Studio throws the SAME error whether I try to use the migration tool OR try to deploy using BI Studio. I need help or this is going to be a Microsoft 2K5 disaster. Is there some workaround for data sizes coming from 2000 AS? I cannot be the ONLY person that has seen this complete stoppage. If you wish to continue on this most-excellent quest to actually see the realization of the intent of "Project REAL" then please give me a ring at 512.845.4950 or reply to this so that we can overcome this very intense (and complete stoppage) road block. Please help. I do not mean to be forward and apologize. This project should have taken approximately 60-80 hours and it is now well into the hundreds of hours. I am merely a fellow MS-er just trying to overcome adversity and see this through to success for us all.

:)

|||

I will investigate this and try to reproduce it on our side to debug and see if we can provide a work-around or a quick fix.

You migrated the AS2000 database with the Migration Wizard? If yes, can you run the wizard again to generate the migration script? (There is an option on the first page to either migrate to AS2005 or generate the script.) Once you have the script, please archive it as a .zip and send it to me at adumitrascu at hotmail dot com.

Adrian Dumitrascu.

|||

I forgot to mention: please double-check and remove sensitive metadata in the AS2000 database and the migration script. Especially verify the connection strings for data sources (for passwords, just change the connection strings in the script to bogus values; I don't need them to be correct). Also, if you have MDX calculations that you want to keep private, please remove them from the script (search for "<Commands>" in the script to review them).

Adrian Dumitrascu.

|||

Adrian,

Thanks again for your help here. You have been great. We do have some issues with running out of memory, which I will be posting. Here is the error:


Errors and Warnings from Response
Memory error: Allocation failure : Not enough storage is available to process this command. .
Memory error: Allocation failure : Not enough storage is available to process this command. .
Errors in the metadata manager. An error occurred when loading the AggregationDesign aggregation design, from the file, '\\?\E:\Program Files\Microsoft SQL Server\MSSQL.2\OLAP\Data\Business Report System ADW.1.db\TA_2003.0.cub\TA_2003.0.det\AggregationDesign.0.agg.xml'.
Error Code = 0x8007000E, External Code = 0x00000000:.

What can I do about this now? We seem to be hitting every roadblock possible during this POC. Any assistance will be greatly appreciated.

krj

|||

Hi Krj,

I have the same problem that you described before the last post.

What solution did you find for the System.OutOfMemoryException problem with the Migration Wizard? Can you help me?

Thanks,

Eder F. Dias

|||

We are actually still working on the issue. The last recommendation was to move up to the 64-bit version of SSAS 2K5. As we continue to piecemeal the cubes into the 32-bit solution, there is an entire discussion about the maintenance and care of the SSAS DB going forward. Given the application memory "ceiling" that you may be experiencing, it may make sense to try the 64-bit version of SSAS 2K5. This would, as I understand it, help allocate the required memory in a manner that would potentially allow the application to run more freely.

krj

|||

Thanks for the answer, Krj.

My environment is:

- Intel Xeon CPU 3.4 GHz, 3.93 GB RAM

- Windows 2003 Enterprise x64 SP1

- SQL Server 2005 Enterprise x64 (SP1 - build 2153)

As you can see, I am working in a 64-bit environment, but the exception still happens. I don't know why. Maybe my environment isn't the best? What can I do to improve it?

Something strange is that the MigrationWizard.exe process runs as 32-bit even in the x64 environment. Is there another version that runs as 64-bit?

More information about my server:

More informations about my server:

OS Name Microsoft(R) Windows(R) Server 2003 Enterprise x64 Edition
Version 5.2.3790 Service Pack 1 Build 3790
Other OS Description Not Available
OS Manufacturer Microsoft Corporation
System Name SRVBRASIL31
System Manufacturer HP
System Model ProLiant DL380 G4
System Type x64-based PC
Processor EM64T Family 15 Model 4 Stepping 3 GenuineIntel ~3400 Mhz
Processor EM64T Family 15 Model 4 Stepping 3 GenuineIntel ~3400 Mhz
Processor EM64T Family 15 Model 4 Stepping 3 GenuineIntel ~3400 Mhz
Processor EM64T Family 15 Model 4 Stepping 3 GenuineIntel ~3400 Mhz
BIOS Version/Date HP P51, 10/10/2005
SMBIOS Version 2.3
Windows Directory C:\WINDOWS
System Directory C:\WINDOWS\system32
Boot Device \Device\HarddiskVolume1
Locale United States
Hardware Abstraction Layer Version = "5.2.3790.1830 (srv03_sp1_rtm.050324-1447)"
User Name Not Available
Time Zone E. South America Standard Time
Total Physical Memory 4,031.28 MB
Available Physical Memory 1.70 GB
Total Virtual Memory 5.69 GB
Available Virtual Memory 3.23 GB
Page File Space 2.00 GB
Page File C:\pagefile.sys

Thanks a lot,

Eder F Dias

|||

Adrian,

You mentioned there is a fix for this coming in SP2. Is this fix available as a hotfix now, or only as part of SP2?

- Steve

|||

Steve,

I have had varying degrees of success and am also trying to keep things running and get them stable. What I did to extend the envelope was to give the "system" as many "resources" as I could. First, I have 8 GB of RAM to your 4 GB; this may be a critical point. Remember, in most cases Windows needs memory just to run as an operating system. In my case, on the 32-bit platform, this was around 600-700 MB for the OS alone. On 64-bit I am not so sure of the base OS memory requirements. For SQL 2K5 on the 32-bit platform, though, it was recommended to me to put the /3GB switch in boot.ini to allow for higher memory addressing. There is also an AWE option in sp_configure (under the properties of the server in SQL 2K5) to allow this memory to be used.
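For anyone following along, the AWE setting mentioned above is a server option on the relational engine; a sketch of enabling it on SQL Server 2005 (32-bit) looks like this. Note that AWE affects only the SQL Server buffer pool — Analysis Services does not use AWE, so this will not raise the SSAS memory ceiling:

```sql
-- Enable AWE for the relational engine (SQL Server 2005, 32-bit).
-- 'awe enabled' is an advanced option, so it must be exposed first.
-- This affects the SQL Server buffer pool only; SSAS does not use AWE.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'awe enabled', 1;
RECONFIGURE;
```

The /3GB switch is a separate, OS-level change: it is appended to the operating system entry in boot.ini (e.g. `... /fastdetect /3GB`) and requires a reboot. Combining /3GB with /PAE on an 8 GB machine has known trade-offs, since /3GB shrinks the kernel's address space.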

The reason I am going over this with you is that it seems you are having the EXACT behavior in the 64-bit version that I am having in the 32-bit version. It seemed logical to assume that I was hitting the wall because of the 32-bit addressing limitations (as told to me by Microsoft). I will continue in the next post.

|||

Exception of type 'System.OutOfMemoryException' was thrown. (Microsoft.SqlServer.SmoEnum)

I just received this message:

Exception of type 'System.OutOfMemoryException' was thrown. (Microsoft.SqlServer.SmoEnum)

I am completely dead in the water. Cannot connect to SSAS 2005. HELP!

Hello,

It is not quite clear what you are trying to do when the exception happens. Is the system really running low on memory?

The comment above seems to suggest that you are trying to connect to SSAS; however, the exception seems to come from SMO (?), which is an object model for working with SQL Server (relational).

If the issue is that you are unable to connect to MSAS, could you please provide more information about the sequence of actions you take (also how you connect -- from Management Studio or in some different way; if from Management Studio, what you enter in the connection dialog, etc.), and more information about when the exception occurs, the stack trace, and other details about the error (in Management Studio the error message usually has a "Show Technical Details" button)?

Thanks a lot,

|||

I am trying to do the initial load of SSAS 2005. Just loading the Microsoft SQL Server\90\Tools\Binn\VSShell\Common7\IDE\SqlWb.exe executable to get to the SSAS databases simply hangs. I had to "split" the SSAS databases to get them imported in three splits. The SSAS 2000 database has 35 dimensions, 49 cubes, and 4 virtual FACT cubes. This was load #2 out of 3. I had just completed the cube builds on 31 out of 49 cubes and had just finished populating the SSAS 2K5 database with 138 million rows. My tests put the total row count at 240 to 280 million rows when completed, and the size of the total SSAS DB is currently projected at 16-20 GB when completed. I tried to re-implement the 2nd set of cubes, and now SSAS 2K5 will not even start. I have tried the /PAE as well as the /3GB switches with no success -- each switch individually, with a full reboot each time. No success. Help.

|||

Update: I took the /3GB switch OUT of boot.ini and was successful in getting SQL Server Management Studio to at least start. Now, as I try to open cubes to work with them in Management Studio, it gives me the following error:

Exception of type 'System.OutOfMemoryException' was thrown. (Microsoft.SqlServer.SmoEnum)

Is there a 2 GB memory limit in SSAS? With the /3GB switch in place it actually fails to load at all. Is there some caching/hashing algorithm that allows the standard non-/3GB configuration to succeed and the /3GB configuration to fail in high-memory-usage cases? This is a bit perplexing. The 2000 OLAP database (the source for the migration wizard) runs at about 1.7 GB TOTAL in production today, and it is stable. Using the Database Migration Wizard (9.0.2047.0) I could NOT get it to migrate the whole 2K OLAP database; it hits the memory exception error. I had to "split" it up (in the Migration Wizard) to get the dimensions and cubes to load by chunking them into SSAS 2K5. I was successful on the first half of the 2K OLAP migration to SSAS 2K5, migrating 31 out of 49 cubes. I have subsequently loaded all of the dimensions (all 35 from above) and cubes (about 31 cubes with about 137 million rows total) with data. I tried to migrate the other outstanding 16 cubes, all the errors started happening, and SSAS 2K5 Management Studio now fails to even load. My hardware/OS config is as follows:

C drive - 10 GB (RAID 1)

D drive - 8 GB (4095 MB pagefile is here) (RAID 10)

E drive - 50GB (RAID 10)

RAM - 8GB

OS - Server 2003 Enterprise (Open License Version 5.2 Build 3790.srv_03_sp1_rtm.050324-1447: ServicePack 1)

DB - SQL Server 2005 Enterprise - SP1/041806, installed 042006 - full install of all components. I have stopped all SQL services except the SQL Browser and SSAS services.

Does anyone else have this issue with SSAS 2005 hitting a memory ceiling?

How can 2000 OLAP run at 1.7 GB and be stable, while SSAS 2K5 fails to even load the exact same data using Microsoft's own toolset?

I am seeking a solution to this issue and am on an extremely aggressive schedule. HELP!

|||

Okay,

I have tried something to see if it will work, and here is where I am at. I cleared the transferred SSAS DB from Management Studio completely. I re-migrated (in three MigrationWizard.exe sessions) most (32 dimensions and 38 cubes) of the existing objects into a new "stand-alone" SSAS 2K5 database. I then "imported" the now almost fully deployed SSAS 2K5 database (all that is left are the FACT tables) into BI Studio successfully using the import function in BI Studio! I then backed up and deleted the SSAS 2K5 DB from Management Studio. Now the bad news. I was able to get the BI project to "build" with no errors, but now I am back to the SAME Exception of type

'System.OutOfMemoryException' was thrown

error I had when only "trying" to partially migrate and populate even 50% of the cubes in Management Studio (see previous postings). Now this same error shows up in the errors dialog at the bottom-left of BI Studio with no other explanation. I have been combing the internet trying to find a solution. Can anyone out there help?

|||

We have a defect in RTM and SP1 builds, that OutOfMemoryException occurs when SQL Management Studio connects to AS2005 with large database(s). We are fixing this for SP2.

Meanwhile, the work-around would be to use Visual Studio to connect to AS2005 to administer or process objects.

Or to temporary separate databases on different AS2005 instances (then script each of them and run the scripts to combine them back on a single AS2005 instance).

Adrian Dumitrascu.

|||

The deimensions and cubes are now in ONE instance of BI Studio 2005. There are no more SQL or ANY databases that are in this box or connected to this instance. I am running a DELL 2850 with DUAL 3.8 Ghz procs, 8GB or RAM, and ALL data partitions that SQL 2K5 touches are RAID 10/15k hard drives. This should be "flying" on 2K5 from all of the press. These databases are running perfectly on SQL 2000 AS at 1.7 GB of RAN with no issues. I want to be crystal-clear with my response. The objects are ALL now in BI studio (definitions of only dimensions and cubes) and the SQL 2K5 BI studio throws the SAME error whether I try to use the migration tool OR try to deploy using BI Studio. I need help or this is going to be a Microsoft 2K5 disaster. Is there some work around to data sizes coming from 2000 AS? I cannot be the ONLY person that has seen this complete stoppage. If you wish to continue on this most-excellent quest to actually see the realization of the intent of "Project-REAL" then please give me a ring at 512.845.4950 or reply to this so that we can overcome this very intense (and complete stoppage) road block. Please help. I do not mean to be forward and apologize. This project should have taken approximately 60-80 hours and it is now well into the hundreds of hours. I am merely a fellow MS-er just trying to overcome adversity and see this through to success for us all.

:)

|||

I will investigate this and will try to reproduce it on our side to debug and see if we can provide a work-around or a quick fix.

You migrated the AS2000 database with Migration Wizard ? If yes, can you run the wizard again to generate the migration script ? (there is an option on the first page to either migrate to AS2005 or generate the script). Once you have the script, please archive it as .zip and send it to me at adumitrascu at hotmail dot com.

Adrian Dumitrascu.

|||

I forgot to mention, please double check and remove sensitive meta-data in the AS2000 database and the migration script. Especially verify the connection strings for data sources (for passwords - just change the connection strings in the script to bogus values, I don't need them to be correct). Also, if you have MDX calculations that you want to keep private, please remove them from the script (search for "<Commands>" in the script to review them).

Adrian Dumitrascu.

|||

Adrian,

Thanks again for your help here. You have been great. We do have some issues with running out of memory that I will be posting. Here is the error:


Errors and Warnings from Response
Memory error: Allocation failure : Not enough storage is available to process this command. .
Memory error: Allocation failure : Not enough storage is available to process this command. .
Errors in the metadata manager. An error occurred when loading the AggregationDesign aggregation design, from the file, '\\?\E:\Program Files\Microsoft SQL Server\MSSQL.2\OLAP\Data\Business Report System ADW.1.db\TA_2003.0.cub\TA_2003.0.det\AggregationDesign.0.agg.xml'.
Error Code = 0x8007000E, External Code = 0x00000000:.

What can I do now about this? We seem to be hitting every roadblock that can be during this POC. Any assistance will be greatly appreciated.

krj

|||

Hi Kjr,

I have the same problem that you had before the last post.

What solution you had about this problem System.OutOfMemoryException with MigrationWizard? You can help me?

tks,

Eder F. Dias

|||

We are actually still working on the issue. The last recommendation was to move UP to the 64-BIT version of SSAS 2k5. As we continue to peice-meal the cubes into the 32-BIT solution there is an entire discussion about maintenace and care of the SSAS DB going forward. Given the application memory "ceiling" that you may be experiencing it may make sense to try the 64-bit version of SSAS 2k5. This would, as I understand it, help to allocate the required memory in a manner that would potentially allow thw application to run more freely.

krj

|||

Thanks Kjr for answer.

Actually my environment is:

- Intel Xeon CPu 3.4 GHZ, 3.93GB RAM

- Windows 2003 Enterprise x64 SP1

- SQL Server 2005 Enterprise x64 (SP1 - build 2153)

How you can see I working in a 64-Bit Environment, but the exception still happen. I don't know why. Maybe my environment isn't the best? What can i do for became it better?

Somethig strange is that the MigrationWizard.exe process work in a 32-Bit even in x64 enviroment. There is another application that work in 64-Bit?

More informations about my server:

OS Name Microsoft(R) Windows(R) Server 2003 Enterprise x64 Edition
Version 5.2.3790 Service Pack 1 Build 3790
Other OS Description Not Available
OS Manufacturer Microsoft Corporation
System Name SRVBRASIL31
System Manufacturer HP
System Model ProLiant DL380 G4
System Type x64-based PC
Processor EM64T Family 15 Model 4 Stepping 3 GenuineIntel ~3400 Mhz
Processor EM64T Family 15 Model 4 Stepping 3 GenuineIntel ~3400 Mhz
Processor EM64T Family 15 Model 4 Stepping 3 GenuineIntel ~3400 Mhz
Processor EM64T Family 15 Model 4 Stepping 3 GenuineIntel ~3400 Mhz
BIOS Version/Date HP P51, 10/10/2005
SMBIOS Version 2.3
Windows Directory C:\WINDOWS
System Directory C:\WINDOWS\system32
Boot Device \Device\HarddiskVolume1
Locale United States
Hardware Abstraction Layer Version = "5.2.3790.1830 (srv03_sp1_rtm.050324-1447)"
User Name Not Available
Time Zone E. South America Standard Time
Total Physical Memory 4,031.28 MB
Available Physical Memory 1.70 GB
Total Virtual Memory 5.69 GB
Available Virtual Memory 3.23 GB
Page File Space 2.00 GB
Page File C:\pagefile.sys

tks a lot

Eder F Dias

|||

Adrian,

You mentioned there is a fix for this coming in SP2. Is this fix available as a Hotfix now or only as a part of SP2?

- Steve

|||

Steve,

I had varying degrees of success and are too trying to keep things running and get them stable. What I had done to extend the envelope is to try give the "system" as much "resources" as I could. First, I had 8 GB of RAM to your 4 GB. This may be a critical point. Remember, in most cases WIndows needs a memory to run as an operating system. In my case on the 32-bit platform this was around 600/700 MB for just the OS. On the 64-bit I am not so sure of the base OS memory requirements. For SQL 2K5 though (on the 32-bit platform) it was recommneded to me to put in the /3GB switch in the boot.ini for SQL 2K5 to allow for higher memory addressing. Also there is an AWE switch in "spconfig" (properties of the server in SQL 2K5) to allow for this memory to be used.

The reason that I am going over this with you is that it seems you are having the EXACT behavior in the 64-bit version that I ma having in the 32-bit version. It seemed logical to assume that I was hitting the wall because of the 32-bit addressing limitations (as told to me by Microsoft). I will continue on the next post.

Exception of type 'System.OutOfMemoryException' was thrown. (Microsoft.SqlServer.SmoEnum)

I just received this message:

Exception of type 'System.OutOfMemoryException' was thrown. (Microsoft.SqlServer.SmoEnum)

I am completely dead in the water. Cannot connect to SSAS 2005. HELP!

hello,

it is not quite clear what are you trying to do when the exception happens. Is the system really running low on memory?

the comment above seems to suggest that you are trying to connect to SSAS, however the exception seems to come from SMO (?) which is an OM for working with SQL server (relational).

if the issue is that you are unable to connect to MSAS, could you please provide more information as to what are the sequence of actions you take (also how you connect -- from Management Studio or in some different way; if from Management Studio - what do you enter in connection dialog etc.), and more information about when the exception occures and stack trace and other info about the error (in management studio error msg usually has "Show Technical Details" button)?

thanks a lot,

|||

I am trying to do the initial load of SSAS 2005. Just loading the Microsoft SQL Server\90\Tools\Binn\VSShell\Common7\IDE\SqlWb.exe executable to get to the SSAS databases is just stopped. I had to "split" the SSAS databses to get them imported into three splits. The SSAS 2000 has 35 dimensions, 49 cubes, and 4 virtual FACT cubes. This was load #2 out of 3. I had just completed the cube builds on 31 out of 49 cubes. Had just completed 138 million rows of populating the SSAS 2k5 database. I had done tests that put the total row count to 240 to 280 million rows when completed. The size of the total SSAS DB is currently projected at 16-20 GB when completed. I tried to re-impliment the 2nd set of cubes and now SSAS 2k5 will not even start. I had tried the /PAE as well as the /3GB swtiches with no success. I tried each switch individually with full reboot each. No success. . Help.

|||

Update. I took the switch /3GB OUT of the boot.ini and was successful in getting the SS Management Studio to at least start. Now, as I try to open cubes to work with them in Management Studio, it gives me the following error:

Excetion of type 'System.OutOfMemoryException' was thrown. (Microsoft.SqlServer.SmoEnum)

Is there a 2GB memory limit in SSAS? When you put in the /3GB switch it actually fails to load initially. Is there some caching/hashing algorythm that allows the standard non /3GB switch to be successfule and the /3GB switch enablement to fail in high-memory usage cases? This is a bit perplexing. The 2000 OLAP database runs at about 1.7 GB TOTAL in production today (the source for the migration wizard), and it is stable. Using the Database Migration Wizard (9.0.2047.0) I could NOT get it to migrate the whole 2k OLAP database. It has the memory exception error. I had to "split" it up (Migtration Wizard) to get the the dimensions and cubes to load by chunking it into SSAS 2k5. I was successful on the first half of the 2k OLAP migration to SSAS 2k5 mmigrating 31 out of 49 cubes. I have subsequently loaded all of the dimensions (all 35 from above) with data and cubes (about 31 cubes with about 137 million rows total) with data. I tried to migrate the other outstanding 16 cubes and all errors started happening and SSAS 2k5 Management Studio is failing to even load. My hardware/OS config is as follows:

C drive - 10 GB (RAID 1)

D drive - 8 GB (4095 pagefile is here) (RAID 10)

E drive - 50GB (RAID 10)

RAM - 8GB

OS - Server 2003 Enterprise (Open License Version 5.2 Build 3790.srv_03_sp1_rtm.050324-1447: ServicePack 1)

DB - SQL Server 2005 Enterprise - SP1/041806 installed 042006 - Full install of all components. I have stopped all SQL Services accept SQL Browser and SSAS services.

Does anyone else have this issue with SSAS 2005 and hitting the ceiling-memory limit?

How can 2000 OLAP run at 1.7 GB and be stable and SSAS 2k5 fails to even load the exact same data using Microsoft's own toolset?

I am seeking to find the solution to this issue and are on an extremely aggreesive schedule. HELP!

|||

Okay,

I have tried something to see if it will work and here is where I am at. I cleared the transferred SSAS DB from Management Studio completely. I re-migrated (in three MigratioWizard.exe sessions) most (32 dimensions and 38 cubes) of the existing objects into a new "stand-alone" SSAS 2K5 database. I then "imported" the now almost fully deployed (all that is left are the FACT tables) SSAS 2K5 into BI Studio successfully using the import funtion in BI Studio! I then backed up and deleted the SSAS 2K5 DB from Management Studio. Now the bad news. I was able to get the BI project to "build" with no errors. Now, I am back to the SAME Exception of type

'System.OutOfMemoryException' was thrown

error I had when only "trying" to partially migrate and populate even 50% of the cubes in Management Studio (see previous postings). Now, this same error shows up under the errors bottom-left dialog box in BI studio with no other explanantions. I have been combing the internet trying to get a solution. Can anyone out there help?

|||

We have a defect in RTM and SP1 builds, that OutOfMemoryException occurs when SQL Management Studio connects to AS2005 with large database(s). We are fixing this for SP2.

Meanwhile, the work-around would be to use Visual Studio to connect to AS2005 to administer or process objects.

Or to temporary separate databases on different AS2005 instances (then script each of them and run the scripts to combine them back on a single AS2005 instance).

Adrian Dumitrascu.

|||

The deimensions and cubes are now in ONE instance of BI Studio 2005. There are no more SQL or ANY databases that are in this box or connected to this instance. I am running a DELL 2850 with DUAL 3.8 Ghz procs, 8GB or RAM, and ALL data partitions that SQL 2K5 touches are RAID 10/15k hard drives. This should be "flying" on 2K5 from all of the press. These databases are running perfectly on SQL 2000 AS at 1.7 GB of RAN with no issues. I want to be crystal-clear with my response. The objects are ALL now in BI studio (definitions of only dimensions and cubes) and the SQL 2K5 BI studio throws the SAME error whether I try to use the migration tool OR try to deploy using BI Studio. I need help or this is going to be a Microsoft 2K5 disaster. Is there some work around to data sizes coming from 2000 AS? I cannot be the ONLY person that has seen this complete stoppage. If you wish to continue on this most-excellent quest to actually see the realization of the intent of "Project-REAL" then please give me a ring at 512.845.4950 or reply to this so that we can overcome this very intense (and complete stoppage) road block. Please help. I do not mean to be forward and apologize. This project should have taken approximately 60-80 hours and it is now well into the hundreds of hours. I am merely a fellow MS-er just trying to overcome adversity and see this through to success for us all.

:)

|||

I will investigate this and will try to reproduce it on our side to debug and see if we can provide a work-around or a quick fix.

You migrated the AS2000 database with Migration Wizard ? If yes, can you run the wizard again to generate the migration script ? (there is an option on the first page to either migrate to AS2005 or generate the script). Once you have the script, please archive it as .zip and send it to me at adumitrascu at hotmail dot com.

Adrian Dumitrascu.

|||

I forgot to mention, please double check and remove sensitive meta-data in the AS2000 database and the migration script. Especially verify the connection strings for data sources (for passwords - just change the connection strings in the script to bogus values, I don't need them to be correct). Also, if you have MDX calculations that you want to keep private, please remove them from the script (search for "<Commands>" in the script to review them).

Adrian Dumitrascu.

|||

Adrian,

Thanks again for your help here. You have been great. We do have some issues with running out of memory that I will be posting. Here is the error:


Errors and Warnings from Response
Memory error: Allocation failure : Not enough storage is available to process this command. .
Memory error: Allocation failure : Not enough storage is available to process this command. .
Errors in the metadata manager. An error occurred when loading the AggregationDesign aggregation design, from the file, '\\?\E:\Program Files\Microsoft SQL Server\MSSQL.2\OLAP\Data\Business Report System ADW.1.db\TA_2003.0.cub\TA_2003.0.det\AggregationDesign.0.agg.xml'.
Error Code = 0x8007000E, External Code = 0x00000000:.

What can I do now about this? We seem to be hitting every roadblock that can be during this POC. Any assistance will be greatly appreciated.

krj

|||

Hi Kjr,

I have the same problem that you had before the last post.

What solution you had about this problem System.OutOfMemoryException with MigrationWizard? You can help me?

tks,

Eder F. Dias

|||

We are actually still working on the issue. The last recommendation was to move UP to the 64-BIT version of SSAS 2k5. As we continue to peice-meal the cubes into the 32-BIT solution there is an entire discussion about maintenace and care of the SSAS DB going forward. Given the application memory "ceiling" that you may be experiencing it may make sense to try the 64-bit version of SSAS 2k5. This would, as I understand it, help to allocate the required memory in a manner that would potentially allow thw application to run more freely.

krj

|||

Thanks Kjr for answer.

Actually my environment is:

- Intel Xeon CPu 3.4 GHZ, 3.93GB RAM

- Windows 2003 Enterprise x64 SP1

- SQL Server 2005 Enterprise x64 (SP1 - build 2153)

How you can see I working in a 64-Bit Environment, but the exception still happen. I don't know why. Maybe my environment isn't the best? What can i do for became it better?

Somethig strange is that the MigrationWizard.exe process work in a 32-Bit even in x64 enviroment. There is another application that work in 64-Bit?

More informations about my server:

OS Name Microsoft(R) Windows(R) Server 2003 Enterprise x64 Edition
Version 5.2.3790 Service Pack 1 Build 3790
Other OS Description Not Available
OS Manufacturer Microsoft Corporation
System Name SRVBRASIL31
System Manufacturer HP
System Model ProLiant DL380 G4
System Type x64-based PC
Processor EM64T Family 15 Model 4 Stepping 3 GenuineIntel ~3400 Mhz
Processor EM64T Family 15 Model 4 Stepping 3 GenuineIntel ~3400 Mhz
Processor EM64T Family 15 Model 4 Stepping 3 GenuineIntel ~3400 Mhz
Processor EM64T Family 15 Model 4 Stepping 3 GenuineIntel ~3400 Mhz
BIOS Version/Date HP P51, 10/10/2005
SMBIOS Version 2.3
Windows Directory C:\WINDOWS
System Directory C:\WINDOWS\system32
Boot Device \Device\HarddiskVolume1
Locale United States
Hardware Abstraction Layer Version = "5.2.3790.1830 (srv03_sp1_rtm.050324-1447)"
User Name Not Available
Time Zone E. South America Standard Time
Total Physical Memory 4,031.28 MB
Available Physical Memory 1.70 GB
Total Virtual Memory 5.69 GB
Available Virtual Memory 3.23 GB
Page File Space 2.00 GB
Page File C:\pagefile.sys

tks a lot

Eder F Dias

|||

Adrian,

You mentioned there is a fix for this coming in SP2. Is this fix available as a Hotfix now or only as a part of SP2?

- Steve

|||

Steve,

I had varying degrees of success and are too trying to keep things running and get them stable. What I had done to extend the envelope is to try give the "system" as much "resources" as I could. First, I had 8 GB of RAM to your 4 GB. This may be a critical point. Remember, in most cases WIndows needs a memory to run as an operating system. In my case on the 32-bit platform this was around 600/700 MB for just the OS. On the 64-bit I am not so sure of the base OS memory requirements. For SQL 2K5 though (on the 32-bit platform) it was recommneded to me to put in the /3GB switch in the boot.ini for SQL 2K5 to allow for higher memory addressing. Also there is an AWE switch in "spconfig" (properties of the server in SQL 2K5) to allow for this memory to be used.

The reason that I am going over this with you is that it seems you are having the EXACT behavior in the 64-bit version that I ma having in the 32-bit version. It seemed logical to assume that I was hitting the wall because of the 32-bit addressing limitations (as told to me by Microsoft). I will continue on the next post.

Exception of type 'System.OutOfMemoryException' was thrown. (Microsoft.SqlServer.SmoEnum)

I just received this message:

Exception of type 'System.OutOfMemoryException' was thrown. (Microsoft.SqlServer.SmoEnum)

I am completely dead in the water. Cannot connect to SSAS 2005. HELP!

hello,

it is not quite clear what are you trying to do when the exception happens. Is the system really running low on memory?

the comment above seems to suggest that you are trying to connect to SSAS, however the exception seems to come from SMO (?) which is an OM for working with SQL server (relational).

if the issue is that you are unable to connect to MSAS, could you please provide more information as to what are the sequence of actions you take (also how you connect -- from Management Studio or in some different way; if from Management Studio - what do you enter in connection dialog etc.), and more information about when the exception occures and stack trace and other info about the error (in management studio error msg usually has "Show Technical Details" button)?

thanks a lot,

|||

I am trying to do the initial load of SSAS 2005. Just loading the Microsoft SQL Server\90\Tools\Binn\VSShell\Common7\IDE\SqlWb.exe executable to get to the SSAS databases is just stopped. I had to "split" the SSAS databses to get them imported into three splits. The SSAS 2000 has 35 dimensions, 49 cubes, and 4 virtual FACT cubes. This was load #2 out of 3. I had just completed the cube builds on 31 out of 49 cubes. Had just completed 138 million rows of populating the SSAS 2k5 database. I had done tests that put the total row count to 240 to 280 million rows when completed. The size of the total SSAS DB is currently projected at 16-20 GB when completed. I tried to re-impliment the 2nd set of cubes and now SSAS 2k5 will not even start. I had tried the /PAE as well as the /3GB swtiches with no success. I tried each switch individually with full reboot each. No success. . Help.

|||

Update. I took the /3GB switch OUT of boot.ini and was successful in getting SQL Server Management Studio to at least start. Now, as I try to open cubes to work with them in Management Studio, it gives me the following error:

Exception of type 'System.OutOfMemoryException' was thrown. (Microsoft.SqlServer.SmoEnum)

Is there a 2GB memory limit in SSAS? When you put in the /3GB switch it actually fails to load initially. Is there some caching/hashing algorithm that allows the standard non-/3GB configuration to be successful and the /3GB switch to fail in high-memory-usage cases? This is a bit perplexing. The 2000 OLAP database runs at about 1.7 GB TOTAL in production today (the source for the migration wizard), and it is stable. Using the Database Migration Wizard (9.0.2047.0) I could NOT get it to migrate the whole 2k OLAP database; it hit the memory exception error. I had to "split" it up (Migration Wizard) to get the dimensions and cubes to load by chunking it into SSAS 2k5. I was successful on the first half of the 2k OLAP migration to SSAS 2k5, migrating 31 out of 49 cubes. I have subsequently loaded all of the dimensions (all 35 from above) with data and cubes (about 31 cubes with about 137 million rows total) with data. I tried to migrate the other outstanding 16 cubes and all the errors started happening, and SSAS 2k5 Management Studio is failing to even load. My hardware/OS config is as follows:

C drive - 10 GB (RAID 1)

D drive - 8 GB (4095 pagefile is here) (RAID 10)

E drive - 50GB (RAID 10)

RAM - 8GB

OS - Server 2003 Enterprise (Open License Version 5.2 Build 3790.srv_03_sp1_rtm.050324-1447: ServicePack 1)

DB - SQL Server 2005 Enterprise - SP1/041806 installed 042006 - Full install of all components. I have stopped all SQL services except SQL Browser and the SSAS services.

Does anyone else have this issue with SSAS 2005 and hitting the ceiling-memory limit?

How can 2000 OLAP run at 1.7 GB and be stable while SSAS 2k5 fails to even load the exact same data using Microsoft's own toolset?

I am seeking a solution to this issue and am on an extremely aggressive schedule. HELP!

|||

Okay,

I have tried something to see if it will work and here is where I am at. I cleared the transferred SSAS DB from Management Studio completely. I re-migrated (in three MigrationWizard.exe sessions) most (32 dimensions and 38 cubes) of the existing objects into a new "stand-alone" SSAS 2K5 database. I then "imported" the now almost fully deployed (all that is left are the FACT tables) SSAS 2K5 database into BI Studio successfully using the import function in BI Studio! I then backed up and deleted the SSAS 2K5 DB from Management Studio. I was able to get the BI project to "build" with no errors. Now the bad news: I am back to the SAME exception of type

'System.OutOfMemoryException' was thrown

error I had when only "trying" to partially migrate and populate even 50% of the cubes in Management Studio (see previous postings). Now, this same error shows up in the errors pane at the bottom-left of BI Studio with no other explanation. I have been combing the internet trying to find a solution. Can anyone out there help?

|||

We have a defect in the RTM and SP1 builds where an OutOfMemoryException occurs when SQL Management Studio connects to AS2005 with large database(s). We are fixing this for SP2.

Meanwhile, the work-around would be to use Visual Studio to connect to AS2005 to administer or process objects.

Or to temporarily separate the databases onto different AS2005 instances (then script each of them and run the scripts to combine them back on a single AS2005 instance).

Adrian Dumitrascu.

|||

The dimensions and cubes are now in ONE instance of BI Studio 2005. There are no other SQL or ANY databases on this box or connected to this instance. I am running a DELL 2850 with DUAL 3.8 GHz procs, 8GB of RAM, and ALL data partitions that SQL 2K5 touches are RAID 10/15k hard drives. This should be "flying" on 2K5 from all of the press. These databases are running perfectly on SQL 2000 AS at 1.7 GB of RAM with no issues. I want to be crystal-clear with my response. The objects are ALL now in BI Studio (definitions of only dimensions and cubes) and the SQL 2K5 BI Studio throws the SAME error whether I try to use the migration tool OR try to deploy using BI Studio. I need help or this is going to be a Microsoft 2K5 disaster. Is there some workaround for data sizes coming from 2000 AS? I cannot be the ONLY person that has seen this complete stoppage. If you wish to continue on this most-excellent quest to actually see the realization of the intent of "Project-REAL" then please give me a ring at 512.845.4950 or reply to this so that we can overcome this very intense (and complete stoppage) road block. Please help. I do not mean to be forward and apologize. This project should have taken approximately 60-80 hours and it is now well into the hundreds of hours. I am merely a fellow MS-er just trying to overcome adversity and see this through to success for us all.

:)

|||

I will investigate this and will try to reproduce it on our side to debug and see if we can provide a work-around or a quick fix.

You migrated the AS2000 database with Migration Wizard? If yes, can you run the wizard again to generate the migration script? (There is an option on the first page to either migrate to AS2005 or generate the script.) Once you have the script, please archive it as .zip and send it to me at adumitrascu at hotmail dot com.

Adrian Dumitrascu.

|||

I forgot to mention, please double-check and remove sensitive metadata in the AS2000 database and the migration script. Especially verify the connection strings for data sources (for passwords, just change the connection strings in the script to bogus values; I don't need them to be correct). Also, if you have MDX calculations that you want to keep private, please remove them from the script (search for "<Commands>" in the script to review them).

Adrian Dumitrascu.

|||

Adrian,

Thanks again for your help here. You have been great. We do have some issues with running out of memory that I will be posting. Here is the error:


Errors and Warnings from Response
Memory error: Allocation failure : Not enough storage is available to process this command. .
Memory error: Allocation failure : Not enough storage is available to process this command. .
Errors in the metadata manager. An error occurred when loading the AggregationDesign aggregation design, from the file, '\\?\E:\Program Files\Microsoft SQL Server\MSSQL.2\OLAP\Data\Business Report System ADW.1.db\TA_2003.0.cub\TA_2003.0.det\AggregationDesign.0.agg.xml'.
Error Code = 0x8007000E, External Code = 0x00000000:.

What can I do now about this? We seem to be hitting every roadblock possible during this POC. Any assistance will be greatly appreciated.

krj

|||

Hi Kjr,

I have the same problem that you had before the last post.

What solution did you find for this System.OutOfMemoryException problem with MigrationWizard? Can you help me?

tks,

Eder F. Dias

|||

We are actually still working on the issue. The last recommendation was to move UP to the 64-bit version of SSAS 2k5. As we continue to piecemeal the cubes into the 32-bit solution there is an entire discussion about maintenance and care of the SSAS DB going forward. Given the application memory "ceiling" that you may be experiencing, it may make sense to try the 64-bit version of SSAS 2k5. This would, as I understand it, help to allocate the required memory in a manner that would potentially allow the application to run more freely.

krj

|||

Thanks Kjr for answer.

Actually my environment is:

- Intel Xeon CPU 3.4 GHz, 3.93GB RAM

- Windows 2003 Enterprise x64 SP1

- SQL Server 2005 Enterprise x64 (SP1 - build 2153)

As you can see, I am working in a 64-bit environment, but the exception still happens. I don't know why. Maybe my environment isn't the best? What can I do to make it better?

Something strange is that the MigrationWizard.exe process runs as 32-bit even in an x64 environment. Is there another application that works in 64-bit?

More informations about my server:

OS Name Microsoft(R) Windows(R) Server 2003 Enterprise x64 Edition
Version 5.2.3790 Service Pack 1 Build 3790
Other OS Description Not Available
OS Manufacturer Microsoft Corporation
System Name SRVBRASIL31
System Manufacturer HP
System Model ProLiant DL380 G4
System Type x64-based PC
Processor EM64T Family 15 Model 4 Stepping 3 GenuineIntel ~3400 Mhz
Processor EM64T Family 15 Model 4 Stepping 3 GenuineIntel ~3400 Mhz
Processor EM64T Family 15 Model 4 Stepping 3 GenuineIntel ~3400 Mhz
Processor EM64T Family 15 Model 4 Stepping 3 GenuineIntel ~3400 Mhz
BIOS Version/Date HP P51, 10/10/2005
SMBIOS Version 2.3
Windows Directory C:\WINDOWS
System Directory C:\WINDOWS\system32
Boot Device \Device\HarddiskVolume1
Locale United States
Hardware Abstraction Layer Version = "5.2.3790.1830 (srv03_sp1_rtm.050324-1447)"
User Name Not Available
Time Zone E. South America Standard Time
Total Physical Memory 4,031.28 MB
Available Physical Memory 1.70 GB
Total Virtual Memory 5.69 GB
Available Virtual Memory 3.23 GB
Page File Space 2.00 GB
Page File C:\pagefile.sys

tks a lot

Eder F Dias

|||

Adrian,

You mentioned there is a fix for this coming in SP2. Is this fix available as a Hotfix now or only as a part of SP2?

- Steve

|||

Steve,

I had varying degrees of success and am also trying to keep things running and get them stable. What I did to extend the envelope was to try to give the "system" as many "resources" as I could. First, I had 8 GB of RAM to your 4 GB. This may be a critical point. Remember, in most cases Windows needs memory just to run as an operating system; in my case on the 32-bit platform this was around 600/700 MB for the OS alone. On 64-bit I am not so sure of the base OS memory requirements. For SQL 2K5 on the 32-bit platform, though, it was recommended to me to put the /3GB switch in boot.ini to allow SQL 2K5 higher memory addressing. There is also an AWE option in sp_configure (or in the server properties in SQL 2K5) to allow this memory to be used.
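For readers following along, the AWE option mentioned above can be set from T-SQL as well as from the server properties dialog. A minimal sketch for SQL Server 2005 32-bit (assumes the service account has the Lock Pages in Memory privilege; the memory value is an example only, and note this applies to the relational engine, not Analysis Services, which has its own memory settings):

```sql
-- 'awe enabled' is an advanced option, so expose advanced options first
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Enable AWE so the 32-bit engine can address memory beyond the 4 GB limit
EXEC sp_configure 'awe enabled', 1;
RECONFIGURE;

-- Optionally cap the buffer pool so the OS keeps enough memory for itself
EXEC sp_configure 'max server memory', 6144;  -- MB; example value only
RECONFIGURE;
```

A restart of the SQL Server service is required for 'awe enabled' to take effect.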

The reason that I am going over this with you is that it seems you are having the EXACT behavior in the 64-bit version that I am having in the 32-bit version. It seemed logical to assume that I was hitting the wall because of the 32-bit addressing limitations (as told to me by Microsoft). I will continue on the next post.

Exception of type System.OutOfMemoryException was thrown

Hi,
Once in a while I get this error when working in Reporting Services.
I am unable to save the changes made if I get this error.
I increased the PF Usage memory size, but that does not work.
Can someone help me with this?
Thanks
Ponnurangam

Hi Ponnurangam,
I have seen the same issue. I believe that the MS Development Environment
for Reporting Services has a memory leak. I was cutting and pasting items
and watched the resident memory of the program grow by over 100 megs.
I work around this by occasionally shutting down the MSDE and coming back in.
Not sure if a fix is in the works. Hopefully so!
"Ponnurangam" wrote:
> Hi,
> Once in a while I get this error when working in reporting services
> I am unable to save the changes made if I get this error
> I increased PF Usage Memory size, but that does not work
> can someone help me with this
> Thanks
> Ponnurangam
>
>

Exception of type Microsoft.ReportingServices.ReportProcessing.bk

Hi,
I am receiving the following error for a new data driven subscription I have
setup.
Exception of type Microsoft.ReportingServices.ReportProcessing.bk
I have searched this newsgroup and google and can't seem to find any
description of the error. The RS logs also yield very little other than the
exception itself.
When running the subscription with a "Run Once" schedule it doesn't seem to
throw the error; if I schedule the report to run on a daily schedule it does
throw the error.
Any help is greatly appreciated. If more details are needed please let me
know.
Thanks,
Nick

what's the delivery mechanism?
make sure the account with which RS runs for unattended execution has
permissions to access the data store being used by your subscription. The
default account for RS from the setup is the Network Service account in
Win2003 Server.
Is this RS 2000 or 2005?
--
Regards,
Thiago Silva
"Nick @. INDATA" wrote:
> Hi,
> I am receiving the following error for a new data driven subscription I have
> setup.
> Exception of type Microsoft.ReportingServices.ReportProcessing.bk
> I have searched this newsgroup and google and can't seem to find any
> description of the error. The RS logs also yield very little other than the
> exception itself.
> When running the subscription with a "Run Once" schedule it doesn't seem to
> throw the error; if I schedule the report to run on a daily schedule it does
> throw the error.
> Any help is greatly appreciated. If more details are needed please let me
> know.
> Thanks,
> Nick

|||

This is SQL RS 2000, I will double check the permissions for the unattended
execution account. I also have a case open with MSFT on this right now, will
share anything I find out from them here.
"HC" wrote:
> what's the delivery mechanism?
> make sure the account with which RS runs for unattended execution has
> permissions to access the data store being used by your subscription. The
> default account for RS from the setup is the Network Service account in
> Win2003 Server.
> Is this RS 2000 or 2005?
> --
> Regards,
> Thiago Silva
> "Nick @. INDATA" wrote:
> > Hi,
> >
> > I am receiving the following error for a new data driven subscription I have
> > setup.
> >
> > Exception of type Microsoft.ReportingServices.ReportProcessing.bk
> >
> > I have searched this newsgroup and google and can't seem to find any
> > description of the error. The RS logs also yield very little other than the
> > exception itself.
> >
> > When running the subscription with a "Run Once" schedule it doesn't seem to
> > throw the error; if I schedule the report to run on a daily schedule it does
> > throw the error.
> >
> > Any help is greatly appreciated. If more details are needed please let me
> > know.
> >
> > Thanks,
> >
> > Nick

|||

FileShare Delivery Extension
"HC" wrote:
> what's the delivery mechanism?
> make sure the account with which RS runs for unattended execution has
> permissions to access the data store being used by your subscription. The
> default account for RS from the setup is the Network Service account in
> Win2003 Server.
> Is this RS 2000 or 2005?
> --
> Regards,
> Thiago Silva
> "Nick @. INDATA" wrote:
> > Hi,
> >
> > I am receiving the following error for a new data driven subscription I have
> > setup.
> >
> > Exception of type Microsoft.ReportingServices.ReportProcessing.bk
> >
> > I have searched this newsgroup and google and can't seem to find any
> > description of the error. The RS logs also yield very little other than the
> > exception itself.
> >
> > When running the subscription with a "Run Once" schedule it doesn't seem to
> > throw the error; if I schedule the report to run on a daily schedule it does
> > throw the error.
> >
> > Any help is greatly appreciated. If more details are needed please let me
> > know.
> >
> > Thanks,
> >
> > Nick

|||

Hey, Nick
Did you ever get a resolution for this issue?
I am not quite sure what the error means, since I've never seen an
exception type "ReportProcessing.bk."
Can you look at your RS logs to get more information on the issue. The
RS log is located in "C:\Program Files\Microsoft SQL
Server\MSSQL.x\Reporting Services\LogFiles".
Take a look at the log file for the time when you encountered the
issue. Try recreating the problem first, so you can have a fresh
logged entry, then look it up on the log.
Also, take a peek at the event log on the server, for anything
suspicious.
If you can post the message from the log, that may help in
troubleshooting that issue.
regards,
Thiago Silva

Sunday, February 26, 2012

Exception Error then Crashes

hello
during working on the report layout the tool shows the
following error then it crashes;
EXCEPTION OF TYPE SYSTEM.OUTOFMEMORYEXCEPTION WAS THROWN.
please advice
Regards
Ahmad Al-khatib
technical support engineer

Could you please send detailed steps to reproduce the issue? Also please
include the version information of OS, VS, and Reporting Services.
--
Ravi Mumulla (Microsoft)
SQL Server Reporting Services
This posting is provided "AS IS" with no warranties, and confers no rights.
<anonymous@.discussions.microsoft.com> wrote in message
news:2089301c459b4$0acfea80$a301280a@.phx.gbl...
> hello
> during working on the report layout the tool shows the
> following error then it crashes;
> EXCEPTION OF TYPE SYSTEM.OUTOFMEMORYEXCEPTION WAS THROWN.
> please advice
> Regards
> Ahmad Al-khatib
> technical support engineer
>

Friday, February 24, 2012

excell export doesn't work

I have a report with multiple tables. When I try to export this report to
Excel, reporting services returns an error message:
Exception of type
Microsoft.ReportingServices.ReportRendering.ReportRenderingException was
thrown. (rrRenderingError) Get Online Help
Exception of type
Microsoft.ReportingServices.ReportRendering.ReportRenderingException was
thrown.
Index was out of range. Must be non-negative and less than the size of the
collection. Parameter name: index
can anybody help me solve this?

"Dirkjan van Groeningen" <DirkjanvanGroeningen@.discussions.microsoft.com>
wrote in message news:F60A2429-8A44-4DE7-B162-D29FB36091E2@.microsoft.com...
>I have a report with multiple tables, when I trie to export this report to
> excell
> it reporting services returns an error message:
>
> Exception of type
> Microsoft.ReportingServices.ReportRendering.ReportRenderingException was
> thrown. (rrRenderingError) Get Online Help
> Exception of type
> Microsoft.ReportingServices.ReportRendering.ReportRenderingException was
> thrown.
> Index was out of range. Must be non-negative and less than the size of the
> collection. Parameter name: index
> can anybody help me solve this?
Hi, I have the same problem. I have posted this issue here 3 times with no
meaningful help.
Please post back if you find a solution.
Good luck,
Bryan

Sunday, February 19, 2012

excel source numeric/string data problem

i've been reading about problems with Excel source data being forced to a numeric type when there are mixed string/numeric values in the data, and that adding IMEX=1 to the extended properties will fix this problem... this is true, but not in my case...

say my Excel file has about 40 rows. If rows 1-39 in column A are all NULL and row 40 has a string in it, the string in row 40 will not be converted, and the Excel source forces this column A data type to be numeric. Having IMEX=1 in there does not work. However, if I add a string anywhere in rows 1-8 in column A, then the string in row 40 will be converted, because the external data type now is a string..

anyone know how to solve this problem?

Solved this problem by going into the Excel file and formatting the first 8 rows as text

|||

An alternative is to change the data type for the column using the "Show Advanced Editor" menu on the Excel Data Source. Select the "Input and Output Properties" tab, expand the Excel Source Output and then the "Output Columns" (note: not the External Columns). Now you can change the DataType property of some or all columns.
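As background on the sampling behavior described in this thread: the JET provider infers each column's type from a small sample of leading rows by default, which matches the observation that a string in the first 8 rows changes the inferred type. A hedged sketch of an Excel connection string with IMEX enabled (the path and settings are placeholders, not values from the original posts):

```
Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\input.xls;
Extended Properties="Excel 8.0;HDR=YES;IMEX=1"
```

The sample size is reportedly controlled by the registry value HKLM\Software\Microsoft\Jet\4.0\Engines\Excel\TypeGuessRows (where 0 scans a much larger number of rows); treat that as an assumption to verify on your own build rather than a confirmed fix from this thread.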

Excel Source from Variables and data type Issue

Hi

I am after some help with the following issue

I have a package that reads a table for the file path of an Excel file. This gets passed to a variable and then the file is imported into a staging table for further transformation work. The issue I have is that file 1 may contain data in column A which is 50 characters long, in which case I have to import the Excel column as a DT_WSTR, do a data conversion to a DT_STR and load to the staging table. However, file 2 may contain data in the column which is over 255 characters, in which case it imports as a DT_NTEXT, which I then transform to a DT_TEXT and then to a DT_STR. I used a fixed file path in the Excel Connection to start with, which was for file 1, so the data type for column 1 is a DT_WSTR. I then changed the Excel connection to a file path variable, put the path of file 2 in my table and called it from my package. It failed, as the data exceeding 255 characters in column 1 needed to be a DT_NTEXT. I can change it and it works, but if I then run the package using file 1 (less than 255 characters) it fails again as it wants it to be a DT_WSTR.

Is there any way around this? Am I missing something, as I would have thought that setting it to DT_NTEXT would cover data under 255 characters as well.

Any help is gratefully appreciated.

DT_WSTR and DT_NTEXT are different types, so SSIS detects this as a metadata change. It's based on what the Excel provider reports back to SSIS.

Someone ran into this a few months ago, and I think they ended up having to use two different data flows, one for DT_WSTR and one using DT_NTEXT.

|||

hi

Thanks for the reply.

I have it kind of working using your suggestion here. I now have 2 data flow tasks: the first handles the troublesome column as a DT_WSTR (the most common for the files imported), with an on-failure constraint to an NTEXT version. In theory it works fine; however, because the first component fails when the column is NTEXT, the whole package fails because it is within a foreach loop.

as far as I can see the only solution would be either to increase the maximum number of errors on the foreach loop, which will cause a problem if something else outside of these 2 components fails, or to change the error handling on the data flow task. I cannot ignore the truncation error in the data flow task, as it will succeed but import only the first 255 characters using the DT_WSTR flow. If I change the on-error output it still appears to count against the error count, so the max error count is reached.

Any suggestions would be greatly received

Thanks again

|||

Could you use an Execute SQL or Script task to query the metadata for the Excel sheet ahead of time, and redirect to the appropriate task? This link has more information on using GetOleDbSchemaTable to get this type of information.

http://support.microsoft.com/kb/309488
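Building on the suggestion above, one hypothetical alternative to GetOleDbSchemaTable is to probe the data itself from an Execute SQL Task via a distributed query, store the result in a package variable, and route with a precedence-constraint expression (e.g. @MaxChars > 255). This is a sketch under assumptions, not the KB article's method; the workbook path, sheet name, and column name are placeholders:

```sql
-- Hypothetical sketch: probe the longest value in the column before
-- choosing a data flow. Assumes 'Ad Hoc Distributed Queries' is enabled
-- and the 32-bit Jet provider is installed.
SELECT MAX(DATALENGTH([Column A])) / 2 AS MaxChars  -- DATALENGTH works on ntext too
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;HDR=YES;IMEX=1;Database=C:\data\input.xls',
                'SELECT * FROM [Sheet1$]');
```

DATALENGTH returns bytes, so dividing by 2 approximates the Unicode character count without tripping over LEN's restriction on ntext columns.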

Excel Source dt_ntext problems

Hi:

I import data from multiple Excel files into a SQL DB. I have trouble with fields that could contain >255 chars.

If I have the col type = DT_Ntext in my Data Flow, the package fails for files that do not have any values >255 chars.

If I have the external coltype=dt_wstr and the output coltype=dt_wstr(4000) the package fails if the file contains any value >255 chars.(Implicit conversion does not occur, as expected).

I worked around it by adding a dummy first row with >255 chars.

Is there a way to use a cast function to solve this prob? I tried using Select dt_ntext(fieldname) from Sheet1$, but that does not work.

Is there some clean way to get around this problem?

TIA
Kar

Hi Karfast

For now this has to be done manually. I was able to do it successfully after getting advice from Bob Bojanic

Follow these instructions carefully:

1) Go to the Mapping page of the Destination; for each of the rows having the problem above, note the data types and lengths of the source column and its corresponding destination column

2) Open the Advanced Editor for the Excel Source and amend the types and lengths for all the columns so that the source and destination columns match

Thanks

Subhash Subramanyam

|||

Metadata of Excel sheets is often hard to deal with. The dummy first row seems like a clever solution to me. It is not elegant, but the entire JET provider is far from that.

Select dt_ntext(fieldname) is not going to work, as dt_ntext can be recognized only by SSIS and not by the JET provider. A more adequate query would be select LongText(fieldname), but I have no idea if JET can deal with it (I don't have a way to try it at the moment).

HTH.

|||

Thanks, Bob.

I tried:

Select LongText(Fieldname), and also tried out Memo etc. All of these give an Undefined Function error.

I can't even find any help on this. There is apparently something called a Jet SQL Reference, but I couldn't find it. Not even in the local Access 2000 Help.

So I guess the dirty workaround is the only way :-(

Kar