Question : Shell script to load log file into database

Dear Experts, can you please help me with a shell script?

Part I

I have a log file which keeps growing every day, and it's FTP'd into /var/tmp on one of the Unix boxes.

1.  Check if the .csv file has arrived in /var/tmp

2.  If it exists, bcp the .csv into the table

3.  Since the same file grows every day, I don't want to load the same records every day; I need to load only the records which don't already exist in the table.

sample data:
1,08/31/2005 08:09:21,117038,login,
2,08/31/2005 09:01:55,116726,login,
3,08/31/2005 09:02:41,116726,query,7310
4,08/31/2005 09:03:06,,login,
5,08/31/2005 09:03:12,,query,7310
6,08/31/2005 09:03:22,,show,12.0.21091459.3462704
7,08/31/2005 09:03:28,,query,7310
8,09/02/2005 07:30:52,116726,login,
9,09/02/2005 07:36:58,125290,login,
42,09/02/2005 08:53:53,112024,login,
43,09/05/2005 09:02:34,1087,login,
44,09/05/2005 09:03:04,1087,query,octel
45,09/06/2005 12:45:57,,login,
46,09/06/2005 12:45:59,012363,login,
47,09/06/2005 12:46:31,010287,login,
48,09/06/2005 12:47:32,,login,

The dates might not be today's date, so how do I check whether a record already exists in the table? The first column is always a unique number, but these record numbers will not necessarily begin with 1 or even be ordered. So can you tell me how to validate against the date field to avoid duplicate entries in the table, and how to validate before bcp'ing the data in if I can't rely on a unique field?


Part II

This file exists on a Windows server; how do I automate the FTP process into the Unix box under /var/tmp/?


Appreciate your help !

Thanks
Kay.

Answer : Shell script to load log file into database

Answering in reverse order:
2.

> bcp alerts2.dbo.MobileLog in  mobilelog.csv -Usa -Ppassword -Sserver -c -t"'" -F1 -L5

This is wrong. The "-t" option specifies the column separator character, which from your data and what you've told us is a comma. That's why my sample bcp command line was:

     bcp .. in -U -S -P -c -t,

i.e. replace -t"'" with -t,
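For completeness, here is a minimal sketch that combines the "has the file arrived" check from Part I with the corrected bcp call. The table name, server, and login are the placeholders from your own command line; substitute your real ones.

```shell
# Sketch only: check the file arrived, then bcp it in with a comma
# column terminator. Table, server, and login below are placeholders
# copied from the question, not working values.
load_csv() {
    csv=$1
    # Step 1: has the .csv arrived yet?
    if [ ! -f "$csv" ]; then
        echo "no file to load: $csv" >&2
        return 1
    fi
    # Step 2: -c character mode, -t, comma terminator, -b1 one row
    # per transaction so a duplicate row is rejected on its own
    bcp alerts2.dbo.MobileLog in "$csv" -Usa -Ppassword -Sserver -c -t, -b1
}
```

You would call it as `load_csv /var/tmp/mobilelog.csv`, e.g. from a cron job.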


1.
Bcp doesn't do any validation; it just inserts rows. So it doesn't really matter whether or not you have a unique column: you're going to have to do the validation separately from the bcp.

One approach could be to create a unique index on the table, then do your bcp in with a transaction batch size of 1 (use -b1). This treats each row inserted via bcp as its own transaction, so if one row is a duplicate it will fail and be rejected, but that won't prevent any of the other rows from being inserted.
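If you would rather not rely on rejected rows, another option (a sketch of my own, not from the answer above) is to pre-filter the csv in the shell: bcp the already-loaded record numbers out to a file, keep only the csv rows whose first field is not in that list, and bcp the remainder in. The filter itself is one awk call:

```shell
# filter_new IDS CSV
# Prints only the rows of CSV whose first comma-separated field (the
# unique record number) does not appear in IDS, a file with one
# already-loaded id per line (e.g. bcp'd out of the table beforehand).
# Note: the NR==FNR idiom assumes the IDS file is non-empty.
filter_new() {
    awk -F, 'NR==FNR { seen[$1] = 1; next } !($1 in seen)' "$1" "$2"
}
```

For example, `filter_new loaded_ids.txt /var/tmp/mobilelog.csv > delta.csv`, then bcp delta.csv in. (`NR==FNR` is true only while awk reads the first file, so it builds the `seen` set from the id list before it scans the csv.)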

Bret made the point that you could use the XFS option (only available in ASE 12.5 and above), which lets you treat a regularly formatted file as though it were a table. Then you could do this entirely in SQL.
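Part II was not addressed above, so here is a hedged sketch. If the Windows server runs an FTP service, the Unix box can pull the file itself on a schedule (e.g. from cron); alternatively you can push from Windows with the stock `ftp -s:scriptfile` client under the Task Scheduler. A pull version, where the host name, credentials, and file names are all placeholder assumptions:

```shell
# Sketch: pull the log from the Windows server's FTP service.
# Host, user, password, and file names are placeholders.
fetch_log() {
    # -n suppresses auto-login so the "user" command below can
    # supply the credentials from the script
    ftp -n windowshost <<'EOF'
user myuser mypassword
ascii
get mobilelog.csv /var/tmp/mobilelog.csv
bye
EOF
}
```

A crontab entry on the Unix box can then run `fetch_log` (followed by the load step) once a day.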