Showing posts with label desktop.

Wednesday, March 7, 2012

Exception thrown: database file cannot be found

Hi,

I'm developing a desktop C# app that uses SQL Everywhere as an embedded database.
I generated a strongly typed DataSet and use it to populate a DataGrid in my app.

When the app first loads, it populates the DataGrid with a line like this:

this.sTORE_INV_LNTableAdapter.Fill(this.inventoriesDataSet.STORE_INV_LN);

That all works fine. Later on, after adding more data to the database (by reading a CSV file), I wanted to refresh the display in the DataGrid.

I used the same line of code:

this.sTORE_INV_LNTableAdapter.Fill(this.inventoriesDataSet.STORE_INV_LN);

However, this time the following exception was thrown:

The database file cannot be found. Check the path to the database. [ File name = .\\Inventories.sdf ]

Does anyone know what may be going on? I saw this article about a bug in VS 2005 when using strongly typed DataSets (http://channel9.msdn.com/wiki/default.aspx/MobileDeveloper.DatabaseCannotBeFoundErrorInTypedDataset)
but that doesn't seem to apply here.

The connection string is identical both times that line of code is called, so I'm a bit baffled about what's going on.

Any help would be appreciated. Thanks,

Jose

Windows CE does not support the relative path you're trying to use. You must specify the correct absolute path to the database on the device's file system.

Also keep in mind that Windows CE does not have drive letters and cannot see your desktop's C: (or D:, etc.) drive, as many developers seem to believe.
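In either case, the fix is to build an absolute path yourself rather than rely on ".\". A minimal sketch (SqlCeConnection from System.Data.SqlServerCe and an Inventories.sdf sitting next to the executable are assumptions on my part, not details from your post):

// Sketch only: build an absolute path instead of relying on ".\"
string dbPath = System.IO.Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "Inventories.sdf");
using (System.Data.SqlServerCe.SqlCeConnection conn =
    new System.Data.SqlServerCe.SqlCeConnection("Data Source=" + dbPath))
{
    conn.Open(); // an absolute Data Source resolves no matter what the current directory is
}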

I'm not running this on Windows CE. I'm trying out the new SQL Server Everywhere and using it on Windows XP.

Thanks though.

If the file cannot be found, that's probably because it can't be found. You can use File.Exists() to verify that.
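For example, something along these lines will show where the relative path actually resolves at that moment (the ".\Inventories.sdf" literal is taken from your error message; the rest is just a sketch):

// Where does .\Inventories.sdf point right now?
string resolved = System.IO.Path.GetFullPath(@".\Inventories.sdf");
Console.WriteLine(resolved);
Console.WriteLine(System.IO.File.Exists(resolved) ? "found" : "NOT found");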

Thanks for the input!!

It sounds so simple and yet I hadn't thought about that. I kept looking for the complicated answer.

It turns out my database file was there all along. The problem was that, after opening and reading the CSV files to import into the db, the next time I tried to access the db the app was looking for the db file in the same directory where my CSVs were... and of course, it wasn't finding it.

So now, after reading a CSV and prior to re-querying the db, I use Directory.SetCurrentDirectory() to reset where the app looks for its db.
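In code it's roughly this (using the exe's folder as the anchor is my assumption about where the .sdf lives; adjust if yours is deployed elsewhere):

// Put the current directory back after the CSV import, then refresh.
System.IO.Directory.SetCurrentDirectory(AppDomain.CurrentDomain.BaseDirectory);
this.sTORE_INV_LNTableAdapter.Fill(this.inventoriesDataSet.STORE_INV_LN);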

Thanks for the help, I had been stumped by this for a week.

-Jose

This one has had me scratching my head for a day, so I'm glad I haven't spent a week...

I'm using a separate project as a class library with a *.sdf database so it can be re-used for several projects.

I had the exact same problem: everything was good until I opened an OpenFileDialog, and things went south after that.

Just a note on your solution (maybe you do this already): you can set the OpenFileDialog.RestoreDirectory flag to true, and after the dialog closes the current directory is restored to what it was before the dialog opened... that way you don't need the Directory.SetCurrentDirectory() call.

// RestoreDirectory puts the current directory back after the dialog closes,
// so relative paths (like the .sdf in the connection string) keep resolving.
OpenFileDialog fileChooser = new OpenFileDialog();
fileChooser.RestoreDirectory = true;

gl

Friday, February 24, 2012

Excel: terrible performance

Running SQLS7 on an NT4 server at 500MHz, with 768MB RAM (256MB
dedicated to SQLS). Using ODBC to get data into an Excel spreadsheet on
a desktop machine. SQL database is about 3GB; spreadsheet comes to about
14MB. Data from 3 tables is being used, related by a field common to all
3; data for one month out of 4 years' total data being extracted.
Performance is extremely slow, 15 minutes or more after selecting Edit
Query on the spreadsheet before the MS Query window comes up, etc.
Performance was terrible on a 300MHz/Win98/Office 97 workstation; it is
still unusable on a 2600MHz/WinXP/Office 2000 one. When performing the Edit
Query, the Windows Task Manager shows workstation CPU usage steady at 100%.
The server machine is not doing anything else; its Task Manager
shows CPU usage during SQL processing to be moderate.
Is this poor performance to be expected with the amount of data and
spreadsheet size? Is there an interface that will give better
performance than ODBC? Will some other software perform better than
Excel?
Best wishes,
Michael Salem
Hi
Have you thought about using DTS to write the file to a share?
You should also check that you don't have logging enabled on the ODBC
connection.
If you execute the SQL in Query Analyzer you may be able to improve the
performance using the query plan.
John
"Michael Salem" <msnews@.ms3.org.uk> wrote in message
news:MPG.1b43a289db966af898968e@.msnews.microsoft.com...
> Running SQLS7 on an NT4 server at 500MHz, with 768MB RAM (256MB
> dedicated to SQLS). Using ODBC to get data into an Excel spreadsheet on
> a desktop machine. SQL database is about 3GB; spreadsheet comes to about
> 14MB. Data from 3 tables is being used, related by a field common to all
> 3; data for one month out of 4 years' total data being extracted.
> Performance is extremely slow, 15 minutes or more after selecting Edit
> Query on the spreadsheet before the MS Query window comes up, etc.
> Performance was terrible on 300MHz/Win98/Office 97 workstation; it is
> still unusable on 2600MHz/WinXP/Office 2000. When performing the Edit
> Query, the windows Task Manager shows workstation CPU usage to be 100%
> steady. The server machine is not doing anything else; the Task Manager
> shows CPU usage during SQL processing to be moderate.
> Is this poor performance to be expected with the amount of data and
> spreadsheet size? Is there an interface that will give better
> performance than ODBC? Will some other software perform better than
> Excel?
> Best wishes,
> --
> Michael Salem
John Bell responded to my question on slow Excel/ODBC/SQL with 3GB SQLS
& 14MB Excel files -- many thanks.

> Have you thought about using DTS to write the file to a share?
> You should also check that you don't have logging enabled on the ODBC
> connection.
> If you execute the SQL in Query analyser you may be able to improve the
> performance using the query plan.
Thanks for these suggestions, I will follow up. I wouldn't expect Query
Analyzer to help, as it is a very simple query, but I will try it.
Reading between the lines it would appear that you're not totally
surprised by the slowness, so it is probably better to seek a more
efficient way of doing the analysis needed than to tweak the present
setup.
I've since learned that a very similar setup on the same hardware but
with a much smaller database works at an acceptable speed.
Best wishes,
Michael Salem
Hi
Check out DTS, as this is a more common way of doing it. See Books Online and
http://www.sqldts.com/default.aspx for information on how to use it.
John
"Michael Salem" <msnews@.ms3.org.uk> wrote in message
news:MPG.1b44d4de9dc5ffee98968f@.msnews.microsoft.com...
> John Bell responded to my question on slow Excel/ODBC/SQL with 3GB SQLS
> & 14MB Excel files -- many thanks.
>
> Thanks for these suggestions, I will follow up. I wouldn't expect Query
> Analyzer to help, as it is a very simple query, but I will try it.
> Reading between the lines it would appear that you're not totally
> surprised by the slowness, so it is probably better to seek a more
> efficient way of doing the analysis needed than to tweak the present
> setup.
> I've since learned that a very similar setup on the same hardware but
> with a much smaller database works at an acceptable speed.
> Best wishes,
> --
> michael Salem
I asked about slowness getting data from SQL to Excel via ODBC; John
Bell made some excellent suggestions, for which many thanks. I append
the most recent message in full for reference, as it was a few days ago.
I was focussing on getting data out of a database which was somebody
else's responsibility; I didn't want to tread on toes. Anyway, after a
bit of analysis I added an index to the database; this made a
dramatic difference. If I had realised that this was the problem, I
would have done it long ago.
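For reference, the change was of this general shape; the table, column, and connection details below are placeholders rather than the real schema, and running the same CREATE INDEX from Query Analyzer works just as well as doing it from code:

// Placeholders only: substitute the real server, table, and the common key /
// date columns used in the join and the one-month filter.
using (System.Data.SqlClient.SqlConnection conn = new System.Data.SqlClient.SqlConnection(
    "Server=myServer;Database=myDatabase;Integrated Security=SSPI"))
{
    conn.Open();
    new System.Data.SqlClient.SqlCommand(
        "CREATE INDEX IX_Detail_CommonKey_Date ON DetailTable (CommonKey, EntryDate)",
        conn).ExecuteNonQuery();
}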
Thanks again,
Michael Salem
John Bell wrote:
> Hi
> Check out DTS as this is a more common way of doing it. See Books online and
> http://www.sqldts.com/default.aspx for information regarding how to use
> this.
> John
> "Michael Salem" <msnews@.ms3.org.uk> wrote in message
> news:MPG.1b44d4de9dc5ffee98968f@.msnews.microsoft.com...
>
>
