
Friday, March 23, 2012

Exclusive Join

I have a link table between a user table and a lookup that contains the following data

ID Role
-- --
92166 1
92166 11
92167 7
92167 11
92210 3
92210 7
92210 11
92211 7
92211 11
92212 7
92212 11
92213 7
92213 11

92213 3

The link table stores which roles a user has. For any given user I need to ask whether they hold a specific combination of roles, e.g. user 92210 must have roles 3 and 7; any additional roles are OK, but they must have every role in the combination I am asking for. A combination can contain anywhere from 1 to 40 roles, so in one case I have to ask for roles (3, 7) and in another (1, 7, 11). The only ways I can figure out how to do this are a dynamic self join or a cursor (yuck).

This is a legacy application, the requirement changed from roles being inclusive to exclusive, and a table-structure change is out of the question.

There must be a better way. Any help would be appreciated.

Thanks in advance
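For any one fixed combination this is the classic relational-division problem, and it needs no dynamic SQL at all; a sketch against the data above (the table name UserRole is assumed, since the question doesn't give one):

```sql
-- Does user 92210 hold BOTH role 3 and role 7?
-- Assumed name: link table UserRole with columns ID and Role.
SELECT ID
FROM UserRole
WHERE ID = 92210
  AND Role IN (3, 7)               -- the required combination
GROUP BY ID
HAVING COUNT(DISTINCT Role) = 2;  -- a row comes back only if ALL required roles are present
```

The catch is that the IN list and the count change per request, which is what pushes you toward dynamic SQL or a split function.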

You could do something like joining to the passed list using a split function I wrote some time ago:

CREATE FUNCTION dbo.Split
(
    @String VARCHAR(200),
    @Delimiter VARCHAR(5)
)
RETURNS @SplittedValues TABLE
(
    OccurenceId SMALLINT IDENTITY(1,1),
    SplitValue VARCHAR(200)
)
AS
BEGIN
    DECLARE @SplitLength INT

    -- Walk the string, peeling off one delimited value per iteration
    WHILE LEN(@String) > 0
    BEGIN
        SELECT @SplitLength = (CASE CHARINDEX(@Delimiter, @String) WHEN 0 THEN
            LEN(@String) ELSE CHARINDEX(@Delimiter, @String) - 1 END)

        INSERT INTO @SplittedValues
        SELECT SUBSTRING(@String, 1, @SplitLength)

        -- Drop the consumed value (and its trailing delimiter) from the front
        SELECT @String = (CASE (LEN(@String) - @SplitLength) WHEN 0 THEN ''
            ELSE RIGHT(@String, LEN(@String) - @SplitLength - 1) END)
    END
    RETURN
END

You could then use it like this:


DECLARE @SomeValues VARCHAR(100)
SET @SomeValues = '1,7,11'

SELECT 'Access Granted'
FROM SomeTable ST
INNER JOIN dbo.Split(@SomeValues, ',') S
    ON S.SplitValue = ST.Role
WHERE ST.ID = 95321
HAVING COUNT(*) = (SELECT COUNT(*) FROM dbo.Split(@SomeValues, ','))

This is untested and may need to be modified, but try something like that if you don't want to use dynamic SQL.
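On later SQL Server versions (2016 and up), the built-in STRING_SPLIT function can stand in for a hand-rolled UDF; a sketch using the same assumed table and column names as the query above:

```sql
DECLARE @SomeValues VARCHAR(100) = '1,7,11';

SELECT ST.ID
FROM SomeTable ST
INNER JOIN STRING_SPLIT(@SomeValues, ',') S
    ON S.value = ST.Role            -- STRING_SPLIT returns a single column named "value"
WHERE ST.ID = 92210
GROUP BY ST.ID
HAVING COUNT(DISTINCT ST.Role) =
       (SELECT COUNT(*) FROM STRING_SPLIT(@SomeValues, ','));
```

Note that STRING_SPLIT requires database compatibility level 130 or higher.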


HTH, Jens Suessmeyer.


http://www.sqlserver2005.de

|||

Thanks, that put me on the right track and I now have a UDF that does the job.


Monday, March 19, 2012

Exchange Server as a LINKED SERVER

I have successfully linked Exchange to my SQL server. What I am trying to do is fetch all the data from Exchange and archive it to a SQL database. My question: once I have got all the information from Exchange and archived it, is it possible to use that archive to restore my Exchange mailboxes, or something similar? Any suggestions are highly appreciated.
I don't think so, as it's not in the format Exchange needs. You'd want to look at just backing up Exchange; for individual mailboxes, you could look at brick-level backup/restore.
-Sue
On Thu, 9 Sep 2004 01:51:13 -0700, "perry" <kenzai_chuz@yahoo.com> wrote:

|||Yeah, I think you're right. But my problem is this: if I use Microsoft's ExMerge utility, it only permits me to back up no more than 2 GB per file, and I am using an E2K (Exchange 2000) server. Any suggestions? Our mail stores are larger than 2 GB because of the attachments. And one more thing: will ExMerge back up or archive the attachments?
|||No suggestions on the ExMerge issues. You'd probably be
better off posting questions on ExMerge in one of the
Exchange newsgroups. Try:
microsoft.public.exchange2000.admin
-Sue
On Sun, 12 Sep 2004 17:55:14 -0700, <anonymous@discussions.microsoft.com> wrote:


Monday, March 12, 2012

Excessive network usage with SQL 2000 to SQL 2005 transactional replication

We have a SQL 2000 database (Publisher) replicating inserts and updates across a 10 Mb link to a SQL 2005 database (Subscriber). The Publisher has two tables we are interested in, one with 50 columns and one with 15. Both tables have six insert/update triggers that fire when a change is made and update columns on the publisher database.

We have set up a pull transactional subscription at the Subscriber that runs against the Publisher every minute. We have limited the publication to publish 6 columns from table 1 and 4 from table 2; changes to any other columns on the Publisher are of no interest. The SQL 2005 database has a trigger on table 1 and table 2 to insert values into a third table. There are around 7,000 inserts/updates on table 1 and 28,000 on table 2 per day. All fields in the tables are text.

We are seeing "excessive" network traffic of approximately 1 MB per minute (approx 2 GB per 24 hrs). We also see that the Distributor databases are getting very large -- up to around 30 GB, and growing until they get culled. We have reduced the culling interval from 72 hours to 24 hours to reduce the size.

Does anyone have any suggestions as to how this "excessive" network traffic can be minimised, and how the distributor database size can be kept down? I suspect the two are related.

Thanks,

Geoff

WA POLICE

is this merge or tran?

|||

I will guess it's merge replication. Merge replication does transfer a lot of metadata over the wire; you can see what's being replicated by running a Profiler trace.

|||

Sorry, I totally misread your thread; you do have transactional replication. You need to understand the data changes you are making at the publisher and what is getting replicated. Every change made to a table that is enabled for replication gets marked as a change that needs to be replicated, and these changes are inserted, one by one, into the distribution database. So if your database is growing to 30 GB, you must either be doing a lot of inserts or updates, or you might be touching BLOB columns as well. You should investigate exactly what you're replicating to understand your scenario.

|||Thanks for the information, Gerg. Yes, we do have transactional replication, and yes, we have determined that the traffic seems to be related to the triggers on the publisher. We are looking at ways to limit these without impacting the business functions that use that database.
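To see what is actually piling up in the distributor, you can query the standard distribution-database metadata tables directly; a sketch (the table names are the usual ones SQL Server creates in the distribution database, but verify them against your install):

```sql
USE distribution;

-- Pending replicated commands, grouped by article: shows which table
-- (and therefore which triggers) generates most of the traffic.
SELECT a.publication_id, a.article, COUNT(*) AS pending_commands
FROM dbo.MSrepl_commands c
JOIN dbo.MSarticles a
    ON a.article_id = c.article_id
GROUP BY a.publication_id, a.article
ORDER BY pending_commands DESC;

-- sp_browsereplcmds decodes the individual pending commands
-- (can return a huge result set; filter by sequence number in practice).
EXEC sp_browsereplcmds;
```

If one article dominates the count, that points at the triggers touching it, which matches what you found.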


Wednesday, February 15, 2012

Excel ODBC Problem

Hello,

I created an Excel sheet that I want to link to and pull data from, but now I'm at a loss as to how to actually make this sheet work in Crystal. Help!

I know I have to set up a new ODBC connection for the spreadsheet I created, which I did.

When I select the spreadsheet as a new connection, it comes up with a blank table (no fields).

What did I do wrong?

|||Create a new connection in Crystal using Access/Excel (DAO).

|||This link may help:

http://technicalsupport.businessobjects.com/KanisaSupportSite/search.do?cmd=displayKC&docType=kc&externalId=c2001842&sliceId=&dialogID=3630043&stateId=1%200%203628127

Rashmi