Is anyone working in ICM for small business?

Postby ITinaGhetto » 12/31/07, 7:58 pm

ICM is a relatively new acronym for Information Content Management. The commercial tool that exemplifies what I mean is Abrevity, which costs over $15K. For a small business, with less than 2 terabytes of data, one might tackle the concept with shell scripting, MySQL, and the like. My firm is a non-profit of this size.
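
To give a flavor of the home-rolled approach, here is a rough sketch. The database name, table layout, and share path are all invented for illustration, and a real pass would batch the inserts (LOAD DATA) rather than calling mysql once per file.

Code:

    #!/bin/bash
    # Sketch: walk a file share and record per-file metadata in MySQL
    # so it can be queried for content-management decisions later.
    # Assumes a database "icm" exists and ~/.my.cnf handles the login.
    SHARE=/srv/files

    mysql icm -e 'CREATE TABLE IF NOT EXISTS files (
        path  VARCHAR(1024) NOT NULL,
        bytes BIGINT NOT NULL,
        mtime BIGINT NOT NULL)'

    # find prints: path|size-in-bytes|mtime-as-epoch, one file per line
    find "$SHARE" -type f -printf '%p|%s|%T@\n' |
    while IFS='|' read -r path bytes mtime; do
        esc=${path//\'/\\\'}    # crude quoting; fine for a sketch only
        mysql icm -e "INSERT INTO files VALUES ('$esc', $bytes, ${mtime%%.*})"
    done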
ITinaGhetto
Member ***
 
Posts: 16
Joined: 06/15/04, 12:00 am
Location: San Francisco, CA, USA

Postby RedRage » 01/01/08, 6:03 pm

Yeah, we have been looking at tools to do it for us, but we settled on file shares and folders through Samba, with shell scripts and such.

We are only working with about 300 GB of data, though, and that stays pretty much constant after weekly cleanups of old data.

It is really not that bad. We do have a problem from time to time with XP's file locking that we cannot seem to figure out: it just locks files, and we need to go into the server as root to rename or remove them. It only happens with XP, though. Dunno why.
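
When it happens, about all I know to check is whether Samba itself still thinks it holds the lock (the file name here is just an example):

Code:

    # does Samba still hold a lock on the stuck file?
    smbstatus -L | grep -i 'stuck-file.doc'

    # list smbd processes and the user/machine each one serves;
    # if a stale session owns the lock, kill that smbd and retry:
    smbstatus -p
    # kill 12345    (the stale smbd's PID from the listing above)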
RedRage
BIG GIANT HEAD I Get Free Beer
 
Posts: 1541
Joined: 12/04/01, 12:00 am

What is "critical path" resource?

Postby ITinaGhetto » 01/01/08, 11:06 pm

RedRage,

What is the critical resource in your Samba-based solution? By that I mean: what do you most need more of? For example, runtime for the scripts (similar to backup windows), disk storage for the results of your scans and analysis, or temporary storage (shadow copy uses a lot of unannounced temporary disk space).

I don't have a good fix for your locked-files issue. I know we have a lot of users who do not log off at the end of the day, and Windows Server 2003 sometimes accumulates a few stale sessions. When a user is forced off by an upgrade or forced restart, their files are not closed cleanly.

Thank you,
IT in a Ghetto
ITinaGhetto
Member ***
 
Posts: 16
Joined: 06/15/04, 12:00 am
Location: San Francisco, CA, USA

Postby RedRage » 01/02/08, 6:33 pm

I would say network bandwidth. When I got here the network was pretty much set up already, and redoing the whole darn thing would be more trouble than it's worth, LOL.

There is one day a week when a script makes a complete compressed tar, which takes about 33 hours, give or take. Other than that it runs great.

Here is a basic rundown of the server setup:
one dual-core Dell, 2 GB RAM, 700 GB storage, dual gigabit NICs (only one used);
one quad-core Dell with the same basic setup;
both running Red Hat Enterprise Linux.

Every 6 hours, rsync copies updated files from the primary server to the backup. The first run took all day, but it usually finishes in less than 5 minutes now.
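
The cron entry amounts to one line (the hostname and paths are examples, not our real ones):

Code:

    # /etc/crontab on the primary: push changes to the backup box
    # every 6 hours; --delete keeps removed files from piling up there
    0 */6 * * * root rsync -a --delete /srv/files/ backup:/srv/files/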

We clean up temp files manually every week (only manually so we have something to do on Fridays). Then the tar script runs on Saturday morning and goes through Sunday, usually finishing about the time people start showing up to work on the next day's paper.
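
Stripped down, the Saturday job is basically this (the paths are examples):

Code:

    #!/bin/bash
    # weekly full archive: this is the ~33-hour job
    STAMP=$(date +%Y%m%d)
    tar -czf /backup/full-$STAMP.tar.gz /srv/files \
        2> /var/log/full-backup-$STAMP.log
    # prune old archives so the backup disk doesn't fill up
    find /backup -name 'full-*.tar.gz' -mtime +31 -delete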

I thought the locking was caused by Samba at first, but shutting it down and killing the user's open files didn't work. Eventually we just gave up and have people tell us when something is locked, since it only happens once in a while.

One thing I would like to add to the servers is striping, to speed up disk access, but it is not really a problem, just a little thing I would like to do.

Our setup for the sales people is similar, except the syncing runs over the second network card, which is connected directly to the backup server, and it backs up pretty much constantly. The large backups there are kept on tape, since that data is more business-critical (accounting stuff; I try to avoid it).
RedRage
BIG GIANT HEAD I Get Free Beer
 
Posts: 1541
Joined: 12/04/01, 12:00 am

Postby SOD » 01/02/08, 9:50 pm

All that scripted backup. Is that better than a RAID config?

Just wondering; I'm going to implement system backup at home.

I've got two dev desktops and one server/sandbox, along with a Wi-Fi laptop. Is there any way to RAID these to one HD?
It is better to be here than there - SOD
SOD
BIG GIANT HEAD I Get Free Beer
 
Posts: 5284
Joined: 12/06/01, 12:00 am
Location: here and there

Postby bob » 01/03/08, 2:21 am

Aren't the backups kept off site, Red?
WYSIWTF
bob
BIG GIANT HEAD I Get Free Beer
 
Posts: 7565
Joined: 12/03/01, 12:00 am
Location: St. Louis

Postby RedRage » 01/04/08, 3:55 am

They won't splurge for the huge tape drives (let alone the media).

I nearly had them talked into a couple of external USB drives, but no luck (yet).

I do keep all the config files on me on my cell phone (LOL), and another guy keeps them on his USB key. I also have some of my test versions, the ones I've been tweaking, on my test box, so I can get the system back up and running within an hour or two of receiving new hardware.
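
The config grab is just a tar of the usual suspects; the list here is an example, not everything a given box needs:

Code:

    #!/bin/bash
    # bundle the config files worth carrying off-box; copy the result
    # to the phone or USB key by hand afterwards
    HOST=$(hostname -s)
    tar -czf "/tmp/configs-$HOST-$(date +%Y%m%d).tar.gz" \
        /etc/samba/smb.conf \
        /etc/fstab \
        /etc/crontab \
        /etc/sysconfig/network-scripts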

I steal one of the other tapes once a week.

I don't trust RAID mirroring at all; I've had very bad luck with mirroring unless you add parity.
RedRage
BIG GIANT HEAD I Get Free Beer
 
Posts: 1541
Joined: 12/04/01, 12:00 am

Is anyone working in ICM for small business?

Postby ITinaGhetto » 01/08/08, 12:25 am

My original question was about ICM (information content management). The thread has evolved into issues of backup and system recovery. I'll put down here what I have learned in the past week about content management.

The most likely path for home-rolled ICM, short of buying Abrevity, is Microsoft Search Server (currently in beta), combined with some scripting and a healthy dose of new disk space.

Search engine technologies, and there are several competitors as you know, are the most likely to help with managing the files. For instance: find all files related to year-end financials and audits, and make it "nearly impossible" to remove them from the file system. Find all files produced for month-end, and nag a sysadmin to remove them after 15 months. And so on.
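
On the Linux side, at least, a crude version of both rules fits in a few lines. The naming conventions below (a year-end directory, month-end-* file names) and the admin address are purely illustrative; a real search engine would replace the find commands.

Code:

    #!/bin/bash
    # crude retention rules over a file share; run as root from cron
    SHARE=/srv/files

    # 1. year-end financials: chattr +i marks a file immutable on
    #    ext2/ext3, about as close to "nearly impossible to remove"
    #    as a plain filesystem gets (root can undo it)
    find "$SHARE" -path '*/year-end/*' -type f -exec chattr +i {} +

    # 2. month-end files older than ~15 months (456 days): nag a
    #    sysadmin rather than deleting anything automatically
    OLD=$(find "$SHARE" -name 'month-end-*' -type f -mtime +456)
    [ -n "$OLD" ] && echo "$OLD" | mail -s 'month-end files past 15 months' admin@example.org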
ITinaGhetto
Member ***
 
Posts: 16
Joined: 06/15/04, 12:00 am
Location: San Francisco, CA, USA

Postby RedRage » 01/08/08, 3:53 pm

As for search...

I've used Linux's locate command. By default, updatedb runs daily from cron (I think that's still the case), but I set mine to run every 30 minutes.

Yes, it adds quite a bit of load for a little while, but if you've got multiple cores and speedy I/O, you really don't notice much.

Pop it into a little script and you are set. It's pretty much what everyone else is doing, just with a prettier interface.
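
The cron change is one line (the updatedb path varies by distro):

Code:

    # /etc/crontab: refresh the locate database every 30 minutes
    */30 * * * * root /usr/bin/updatedb

and the "prettier interface" is just a wrapper; the script name and share path are examples:

Code:

    #!/bin/bash
    # filesearch: case-insensitive lookup, limited to the file share
    # usage: filesearch <pattern>
    locate -i "$1" | grep '^/srv/files/'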

Google has an appliance that does it too. I've never used it, but it's Google, so it's cool :)
RedRage
BIG GIANT HEAD I Get Free Beer
 
Posts: 1541
Joined: 12/04/01, 12:00 am

