
Monday, March 26, 2012

Exec permissions and selecting from other databases

I have SQL 2000 with a couple of databases on the same instance - say DB1 and DB2.
In DB2 I have assigned EXECUTE permission for a user on a stored procedure that checks against a table in DB1, e.g.:
--In DB2 stored proc
SELECT * FROM Customer
WHERE CustomerID IN (SELECT CustomerID FROM DB1.dbo.OtherCustomers)
However, the user gets an error stating that they do not have SELECT permission on the other table (OtherCustomers in DB1).
Is there any way I can assign permissions just on this stored procedure (in DB2) without granting SELECT permission for the user in DB1?
Thanks
Gerry

Your objects need to be owned by the same login in order to maintain an unbroken ownership chain, so that users don't need permissions on indirectly referenced objects. You can map the 'dbo' user to the same login in both databases by specifying the same database owner. For example:
USE DB1
EXEC sp_changedbowner 'MyLogin'
GO
USE DB2
EXEC sp_changedbowner 'MyLogin'
GO
Additionally, you need to enable cross-database chaining in both databases.
This option was introduced in SQL 2000 SP3. For example:
EXEC sp_dboption 'DB1', 'db chaining', 'true'
EXEC sp_dboption 'DB2', 'db chaining', 'true'
See the cross-database ownership chaining topic in Books Online for details.
Hope this helps.
Dan Guzman
SQL Server MVP
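To round this out, here is a minimal sketch of the permissions side (the procedure name and user name are made up for illustration): once the ownership chain is intact, EXECUTE on the procedure is the only grant the user needs.

```sql
-- In DB2: grant only EXECUTE on the procedure. No SELECT permission is
-- needed on DB1.dbo.OtherCustomers once the ownership chain is unbroken.
USE DB2
GO
GRANT EXECUTE ON dbo.GetCustomers TO SomeUser
GO
-- The user can now run the proc, but still cannot query the remote
-- table directly:
-- SELECT * FROM DB1.dbo.OtherCustomers  -- fails: no SELECT permission
EXEC dbo.GetCustomers
```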

Exec a resultset to .txt file directly.

Hi all.
Is there a way I can have my resultset saved to a file directly from a SP
using SQL2000.
E.g.
use northwind
CREATE PROCEDURE ToTxt
AS
SELECT *
FROM Customers
/* I want something like*/
/*
SELECT *
FROM Customers
TO c:\Cust.txt
*/
/* Then I can run the following to create the file */
Exec ToTxt
Thanx all.
gh

No. One way is to use xp_cmdshell to execute either OSQL or BCP to output the text file.
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
http://www.solidqualitylearning.com/
Blog: http://solidqualitylearning.com/blogs/tibor/
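For the BCP route mentioned above, something along these lines should work (the server name and the trusted-connection switch are assumptions; adjust for your environment):

```sql
-- Export a query result to a text file via BCP from inside SQL Server.
-- -c writes character (text) format; -T uses a trusted connection.
EXEC master..xp_cmdshell
    'bcp "SELECT * FROM Northwind.dbo.Customers" queryout "C:\Cust.txt" -c -T -S MyServer'
```

Note that in SQL 2000, xp_cmdshell requires sysadmin rights (or a configured proxy account) for the calling user.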
"Geir Holme" <geir@.multicase.no> wrote in message news:Odb$wT9JGHA.1312@.TK2MSFTNGP09.phx.gbl...

Exec a resultset to .txt file directly.

Hi all.
Is there a way I can have my resultset saved to a file directly from a SP
using SQL2000.
E.g.
use northwind
CREATE PROCEDURE ToTxt
AS
SELECT *
FROM Customers
/* I want something like*/
/*
SELECT *
FROM Customers
TO c:\Cust.txt
*/
/* Then I can run the following to create the file */
Exec ToTxt
Thanx all.
gh

Try something like:
EXEC master..xp_cmdshell 'osql.exe -S MyServer -U MyUserName -P MyPassword -Q "EXEC storedProcedure" -o "C:\savedResultset.txt"'
--
Jack Vamvas
__________________________________________________________
Receive free SQL tips - register at www.ciquery.com/sqlserver.htm
SQL Server Performance Audit - check www.ciquery.com/sqlserver_audit.htm
New article by Jack Vamvas - SQL and Markov Chains - www.ciquery.com/articles/art_04.asp
"Geir Holme" <geir@.multicase.no> wrote in message news:%23rEX5SyJGHA.2808@.TK2MSFTNGP15.phx.gbl...
|||It would be something like
e.g.: EXEC master..xp_cmdshell 'OSQL -E -Q"SELECT * FROM Customers" -O"C:\Cust.txt"'
Parameters may vary depending on your server settings.
HTH, Jens Suessmeyer.

Friday, March 23, 2012

exclusive lock

(Please correct me if anything here is incorrect.)
I know that in SQL 2000 locks are well managed internally, and many DML/DDL statements automatically take locks.
I just want to know why and how a DBA would need to control locking explicitly, and under what kind of situations?
Thanks for sharing.

You may change locking behaviour by changing the isolation level for the connection:
SET TRANSACTION ISOLATION LEVEL SERIALIZABLE (others are documented in Books Online)
You may also change locking behaviour for a single table in a single transaction using lock hints, e.g.:
SELECT * FROM titles WITH (NOLOCK)
In my experience the two most common reasons for adjusting locking are:
1. To increase locking so that no one can touch the data until you are through with it. Perhaps you are doing some kind of adjustment to the money fields of many rows (like a cost allocation); any changes made by others during your process will skew your work and make it inaccurate. You may therefore choose to raise the isolation level to SERIALIZABLE so no one else can touch the data until you are through. A similar case is when you are doing a SELECT but intend to update later and do not want the values to change; you might select WITH (HOLDLOCK).
2. The second reason is that you wish to read some data and NOT be blocked by updaters. You may choose to ignore exclusive locks and read dirty data; the NOLOCK select above will do that.
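Both cases can be sketched as follows (table and column names are illustrative, borrowed from the pubs sample database):

```sql
-- Case 1: read now, update later, without letting others change the row.
-- HOLDLOCK keeps the shared lock until the transaction ends.
BEGIN TRAN
SELECT price FROM titles WITH (HOLDLOCK)
WHERE title_id = 'BU1032'
-- ... calculate the adjustment ...
UPDATE titles SET price = price * 1.05 WHERE title_id = 'BU1032'
COMMIT TRAN

-- Case 2: a report that must not be blocked by writers
-- (accepts the risk of dirty reads).
SELECT * FROM titles WITH (NOLOCK)
```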
Hope this helps.
Wayne Snyder, MCDBA, SQL Server MVP
Computer Education Services Corporation (CESC), Charlotte, NC
www.computeredservices.com
(Please respond only to the newsgroups.)
I support the Professional Association of SQL Server (PASS) and its community of SQL Server professionals.
www.sqlpass.org
"pk" <pk@.> wrote in message news:utAG1ewEEHA.2768@.tk2msftngp13.phx.gbl...

Monday, March 12, 2012

Excessive network usage with SQL 2000 to SQL 2005 transactional replication

We have a SQL 2000 database (Publisher) replicating inserts and updates across a 10Mb link to a SQL 2005 database (Subscriber). The Publisher has two tables we are interested in: one with 50 columns and one with 15. Both tables have six insert/update triggers that fire when a change is made, to update columns on the publisher database.

We have set up a pull transactional replication on the Subscriber to occur against the Publisher every minute. We have limited the subscription/replication configuration to publish 6 columns from table 1 and 4 from table 2; changes occurring on any other columns in the Publisher are of no interest. The SQL 2005 database has a trigger on table 1 and table 2 to insert values into a third table. There are around 7,000 inserts/updates on table 1 and 28,000 on table 2 per day. All fields in the tables are text.

We are seeing "excessive" network traffic of approximately 1MB per minute (approx 2GB per 24 hrs). We also see that the Distributor databases are getting very large - up to around 30GB, growing until they get culled. We have reduced the culling interval from 72 hours to 24 hours to reduce the size.

Does anyone have any suggestions as to how this "excessive" network traffic and the distributor database size can be minimised? I think that maybe they are both related.

Thanks,

Geoff

WA POLICE

is this merge or tran?

|||

I will guess it's merge replication. Merge replication does transfer a lot of metadata over the wire; you can see what's being replicated by running a Profiler trace.

|||

Sorry, I totally misread your thread - you do have transactional replication. You need to understand the data changes you are making at the publisher and what is getting replicated. Every change made to a table that is enabled for replication gets marked as a change that needs to be replicated, and these changes are inserted, one by one, into the distribution database. So if your database is growing to 30 GB, you must either be doing a lot of inserts or updates, or you might be touching BLOB columns as well. You should investigate exactly what you're replicating to understand your scenario.
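To see what is actually being queued up, one place to look is the distribution database itself (assuming the default database name of `distribution`):

```sql
-- Run on the Distributor: how many commands are waiting to be delivered?
USE distribution
GO
SELECT COUNT(*) AS pending_commands
FROM MSrepl_commands
GO
-- sp_browsereplcmds decodes the queued commands so you can see exactly
-- which statements are being replicated.
EXEC sp_browsereplcmds
```

If the pending-command count is dominated by updates from your triggers, that points directly at the trigger traffic as the source of both the network load and the distribution database growth.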

|||Thanks for the information, Greg. Yes, we do have transactional replication, and yes, we have determined that the traffic seems to be related to the triggers on the publisher. We are looking at ways to limit these at present without impacting the business functions that use that database.


Friday, February 24, 2012

Excel XP and SQL Server... a good match?

Suppose I have an application made in VB6 which connects to SQL 2000.
The application creates a DSN (old style) to connect to the server.
The user has a login and a password to use the application. The server validates the Windows NT login.
Every effort has been made to protect the database from intruders.
So far so good...
Next day, Clark Kent comes along and connects to the server simply by using the DSN and Excel 2000! He can browse, update and delete data!
How can I avoid such an intruder?
Thank you... waiting for replies...
Rick
Use stored procedures and validate within the procedures with APP_NAME().
http://sqlservercode.blogspot.com/
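A minimal sketch of that approach (the procedure name and the expected application name are made up):

```sql
-- Reject calls that do not come from the known application.
CREATE PROCEDURE dbo.UpdateCustomer
AS
IF APP_NAME() <> 'MyVB6App'
BEGIN
    RAISERROR ('Access denied: unexpected application.', 16, 1)
    RETURN
END
-- ... real update logic goes here ...
GO
```

Bear in mind that APP_NAME() reflects whatever application name the client puts in its connection string, so a determined user can spoof it; it is a deterrent, not real security.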
|||On Wed, 21 Sep 2005 12:20:08 -0700, "Rick"
<Rick@.discussions.microsoft.com> wrote:
See "application roles" in BOL.
In other words, grant only limited and/or read access to the Windows
login, require another user or application login for updates and such.
J.
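An application-role sketch of that suggestion (the role name and password are placeholders): the Windows login gets little or no direct table access, and the application activates the role after connecting, so Excel connecting with the same login sees nothing.

```sql
-- One-time setup: create the application role and grant it the access
-- the VB6 application needs.
EXEC sp_addapprole 'MyAppRole', 'StrongPassword'
GRANT SELECT, UPDATE ON dbo.Customers TO MyAppRole

-- At run time, the application (not Excel) activates the role after
-- connecting; permissions of the Windows login no longer apply.
EXEC sp_setapprole 'MyAppRole', 'StrongPassword'
```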
