In my previous post, I discussed one option for downloading, but NOT installing, required and marked packages for a Red Hat server. In Part I, I included a script that creates a directory and downloads the rpm files specifically into that directory. I tested that script on an RHEL 5.5 machine.
This post is a continuation of Part I and assumes that the reader has, in at least some fashion, downloaded the rpm files needed to patch a system that can't (or doesn't) touch the internet. That offline system is assumed to be mirrored by the system we used to download the patches. So....
Using the script from Part I, and today's date, we should have a folder /vpmt/updates/2011_01_04/ that contains all of the currently needed rpm files. What are we supposed to do with these files, other than just stare at them?
I am so glad you asked, and I hope that you are prepared for a long, LONG, drawn-out answer. Please understand that there is a LOT of work involved in updating an offline system with patches downloaded on a mirrored online system. So...
1. Copy the files to a disk
2. Copy the files from the disk to the offline system
3. Open a terminal window and navigate to the folder you copied the rpm files to in step 2
4. Execute (as root, or with su) chmod 755 *.rpm
5. Execute (as root, or with su) rpm -Uvh *.rpm from the directory where the files were copied (the whole sequence is sketched just below this list)
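For the copy-and-paste crowd, here is a minimal sketch of steps 2 through 5 as shell commands. The mount point /media/usbdisk and the working folder /root/offline_updates are just assumptions for illustration; substitute whatever paths you actually use, along with the date folder created by the Part I script.

    # Run as root (or via su) on the OFFLINE system.
    # /media/usbdisk and /root/offline_updates are hypothetical paths -- adjust to your setup.
    mkdir -p /root/offline_updates
    cp /media/usbdisk/2011_01_04/*.rpm /root/offline_updates/
    cd /root/offline_updates
    chmod 755 *.rpm
    rpm -Uvh *.rpm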
...and those five steps, ALL five long and tedious steps, are all that should be required to install the patches that you downloaded on the first system. Now, I know what you are thinking: "What about all the dependencies that are bound to be present?" This is where the way rpm works, and the options it offers, comes into play.
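As a small preview of those options: rpm's --test flag does a dry run of the upgrade, reporting dependency and conflict problems without actually installing anything. A minimal sketch, run from the same directory as step 5:

    # Dry run: check for dependency/conflict problems without touching the system.
    rpm -Uvh --test *.rpm
    # Any "Failed dependencies" that are reported point to packages that also
    # need to be downloaded on the mirrored online system and copied over.

If the test run comes back clean, the real rpm -Uvh *.rpm should go through without complaints.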
More to follow about RPM.
Saturday, January 16, 2010
Moving Forward
The winter semester for my MSc. started this past Wednesday, and today I actually have some time to look at the posted class videos. I am a distance-learning student through the University of Michigan - Dearborn. The two classes I am currently taking are Database Systems and Web Services. Interestingly enough, both classes relate to this crazy idea I have: a global repository of malware information (source, binaries, hashes, IPs, etc.). So instead of giving the current video my full attention, I am listening with about 40% of it and spending the rest of my concentration on this post. :-)
Some might ask: But isn't a global repository already available through sites such as wilw0rm.com, for example? Not to the level of detail I prefer, or think the security community could use. There are numerous organizations that focus solely on network security, or on attacking that security: EmergingThreats, the MetaSploit project, SANS, and so on. There are also numerous research groups (academia and business) with their own internal data stores of attack vectors. Then there are the legal and procedural requirements of governments and businesses that mandate storing data about attacks and compromises.
As an example: I was recently asked by another network security individual about some JavaScript that was extremely obfuscated. I was intrigued, worked on it for a bit, and then remembered that I know of some people who keep repositories of obfuscated JavaScript. So I contacted them in hopes that they had already seen this particular vector. The result was that I didn't get any help, because the repository these individuals had was (and is) controlled by an anti-virus vendor, and the vendor was unwilling to share research results.
It would have been much more promising for the security industry if this other tech could have shared the code he found directly with those who control that particular repository. Maybe my buddy could have had an answer much quicker than he ultimately did. More to the point, though, a globally available repository would have alleviated a lot of the headache for my buddy, and it would relieve that headache for a lot of other people.
What do I propose? I propose that a massive undertaking begin, where everyone pitches in, but I get all the credit. :-)
Seriously though: why not take ALL the data each time your employer or home network is attacked, convert it to an XML file, and upload it for everyone else to see (minus internal IP/connection data, user names, etc.)? The logistics are harder than this probably sounds to most people, but it could be done. I can already hear some screaming, "But then even more 'would-be' attackers (read: fat, pimply-faced scriptkiddies) would be able to attack us." I answer with the belief that if we (the network security industry as a whole!) were to immediately publish this data for all to see, we would reduce the attack landscape currently available to the bad guys. Signatures can be written, rules enabled, and the good guys score. The immediate downside is that the bad guys would have to (and they WOULD) get smarter. However, if we had 10 times as many examples of a particular exploit, wouldn't signature AND behavior-based detection be much more feasible, and effective? Of course, I could just be ranting and raving because it's a Saturday and college football is over! :-(
I am getting off the soapbox in favor of doing some work, listening to one of my profs, and seeing if the Tigers have made any more moves in preparation for Spring Training. :-)