Welcome to elreno.org

Software Copyright Difficult to Enforce

If you love computer games, you probably know more about software copyright than you ever thought you'd want to, especially if you own or have ever owned multiple computers. Most new games come not only with special copyrights but also with built-in security features designed to enforce those copyrights. Some companies have even gone so far as to sell you the right to 'use' the material rather than actual ownership of the software to which they hold the copyright. That bothered me a bit at first, but I've come to understand it's another way of protecting their rights as well as controlling or limiting how you use the software they provide.

Software copyright is actually quite confusing and hotly debated. Many stores will not accept opened software as returns because the software companies won't reimburse them for the product, and the stores are left holding the bag. It doesn't sound like much until you consider the thousands of consumers attempting to return opened software because they didn't like it or, worse, because they only needed to install it once for it to keep running.

Companies that produce computer software have become savvy to the ways of the modern consumer. Game makers in particular often require that the disc actually be in your drive for the game to run properly. This enforces the software copyright to the extent that two people can't reasonably share ownership of the same game, since each needs an actual disc to play. But for every solution there is a hacker or budding programmer who creates a new problem for software makers and copyright holders to face. One of the latest is the virtual CD: the computer is tricked into 'seeing' the CD where it should be and runs the game as though the disc were present.
Another important thing to note about software copyright is that there are many free programs available that mimic some of the more notable applications. These are often referred to as open source software, and their quality often rivals or exceeds that of similar paid programs. One thing I've noticed is that I will often find free open source software, download it, love it, and a few months later find a more polished version of the same software by the same company, with a few more bells and whistles, available for a fee. The new, improved software carries a software copyright and is not free to consumers, but it is also a much better version than what I currently have. It's a great way for new software developers to make names for themselves and recruit volunteers for the testing phase of their development process.

A software copyright offers protection and recognition to the owner of the software. The problem with protecting software is that it is impossible to police properly. That would require walking into every home on the planet and checking each computer to make sure there are no duplicate, extra, or illegal copies. Plus, who keeps the actual boxes from all their software? I certainly do not. I could never prove that I was honoring the software copyright if the packaging or receipts were the only way of doing so. Most people in the world today honestly want to do the right thing. Software is one of the most expensive purchases people make for their home computers, so it only makes sense to buy genuine copies covered by an actual software copyright in order to protect your investment, not only in your software but also in your computer.

Web Hosting - Why Backups Are Essential

One thing most web site owners have little time for is... anything! Anything other than focusing on their site content and the business or service it supports and the information it provides, that is. That means that administration often suffers, as it frequently must. There's only so much time in the day. But the one thing you should never let slide is backups. They are like insurance. You rarely need it (you hope), but when you do, you need it very badly.

Performing regular backups, and testing them, doesn't have to be a nightmare. With a little forethought and effort they can be automated to a high degree. And they should be tested from time to time. Even when a backup appears to have gone off without a hitch, the only way to know whether it's of any value is to attempt to restore the information. If it can't be restored, the backup is worthless.

Even when the web hosting company provides the service, there is still some planning involved for the site owner. Hosting companies often rely on one or both of two methods: backing up everything (called a full backup), then backing up anything that has changed since the last full backup (called an incremental backup).

Of special interest are any configuration files that have been tailored. If you've modified the default installation of a software package, you want to be able to recapture or reproduce those changes without starting from scratch. Network configuration files, modifications to basic HTML files, CSS style sheets and others fall into the same category. If you have XML files, databases, spreadsheets or other files that carry product or subscriber information - about items purchased, for example, or people who signed up for a newsletter - those should get special attention, too. That's the lifeblood of your business or service. Lose them and you must start over. That can break your site permanently.
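The full-versus-incremental distinction above can be sketched in a few lines. This is only an illustrative sketch in Python, not any hosting company's actual tooling; the function names and backup layout are assumptions for the example.

```python
import os
import shutil
import time

def full_backup(src, dest):
    """Copy every file under src into dest (a 'full' backup).

    Returns the time the backup ran, so later incremental backups
    know their cutoff point.
    """
    shutil.copytree(src, dest)  # dest must not already exist
    return time.time()

def incremental_backup(src, dest, since):
    """Copy only files modified after the last backup's timestamp."""
    copied = []
    for root, _dirs, files in os.walk(src):
        for name in files:
            path = os.path.join(root, name)
            if os.path.getmtime(path) > since:
                rel = os.path.relpath(path, src)
                target = os.path.join(dest, rel)
                os.makedirs(os.path.dirname(target) or dest, exist_ok=True)
                shutil.copy2(path, target)  # copy2 preserves timestamps
                copied.append(rel)
    return copied
```

Real backup tools track far more (deletions, permissions, open files), but the core idea is the same: record a cutoff, then only touch what changed after it.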
It should go without saying that all HTML and related web site files that make up your visible pages should be backed up regularly. It isn't necessary to record every trivial change, and you can tailor backup software to exclude files or folders, but individual files are usually so small that the exclusions aren't worth the trouble. In some cases, though, those small files can add up when there are many thousands of them.

Here again, the backups are worthless if they can't be used. Even if the hosting company charges for doing so, it's worthwhile to test at least once or twice a year to ensure the data can be restored. That's especially true of database backups, which often involve special software and routines. Database files have a special structure, and the information in them is related in ways that require backups to be done differently.

Developing a backup strategy can be straightforward. Start simply and review your plan from time to time, modifying it as your site changes and grows. But don't neglect the subject entirely. The day will come when a hard drive fails, or you get hacked or attacked by a virus, or you accidentally delete something important. When that day comes, the few minutes or hours you spent developing and executing a backup plan will have saved you days or weeks of effort.
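The restore test described above can be checked mechanically: hash every file in the original tree and in the restored copy, then compare. A minimal sketch, with illustrative names:

```python
import hashlib
import os

def checksum_tree(root):
    """Map each file's relative path to a SHA-256 digest of its contents."""
    sums = {}
    for base, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(base, name)
            with open(path, "rb") as fh:
                digest = hashlib.sha256(fh.read()).hexdigest()
            sums[os.path.relpath(path, root)] = digest
    return sums

def restore_matches(original, restored):
    """True only if the restored tree has the same files with the same bytes."""
    return checksum_tree(original) == checksum_tree(restored)
```

A restore that passes this check proves the backup actually contains usable data, which is the whole point of testing it.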

Web Hosting - Sharing A Server - Things To Think About

You can often get a substantial discount on web hosting fees by sharing a server with other sites. Or you may have multiple sites of your own on the same system. But just as sharing a house can have benefits and drawbacks, so can sharing a server.

The first consideration is availability. Shared servers get rebooted more often than stand-alone systems, and that can happen for multiple reasons. Another site's software may produce a problem or make a change that requires a reboot. While that's less common on Unix-based systems than on Windows, it still happens. Be prepared for more scheduled and unplanned outages when you share a server.

Load is the next, and more obvious, issue. A single pickup truck can only haul so much weight. If the truck is already half loaded with someone else's rocks, it will not haul yours as easily. Most websites are fairly static: a reader hits a page, then spends some time skimming it before loading another. During that time, the server has capacity to satisfy other requests without affecting you. All the shared resources - CPU, memory, disks, network and other components - can easily handle multiple users, up to a point.

But all servers have inherent capacity limits. The component that processes software instructions (the CPU) can only do so much. Most large servers have more than one (some as many as 16), but there are still limits to what they can do. The more requests they receive, the busier they are. At a certain point, your request (such as loading a website page) has to wait a bit. Memory on a server works the same way. It's a shared resource, and there is only so much of it. As it gets used up, the system lets one process use some, then another, in turn. Sharing that resource causes delays, and the more requests there are, the longer the delays.
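The "busier server, longer wait" effect described above can be made concrete with a textbook queueing formula: in the simple M/M/1 model, average response time is 1 / (mu - lambda), where mu is how many requests per second the server can handle and lambda is how many arrive. This is a standard approximation for illustration, not how any particular host models its load, and the numbers below are invented.

```python
def avg_response_time(service_rate, arrival_rate):
    """M/M/1 average response time in seconds.

    service_rate: requests/second the server can process (mu)
    arrival_rate: requests/second actually arriving (lambda)
    """
    if arrival_rate >= service_rate:
        return float("inf")  # overloaded: the queue grows without bound
    return 1.0 / (service_rate - arrival_rate)

# A server handling 100 req/s answers in ~0.02 s at half load,
# but ~0.1 s at 90% load: delay grows sharply as sharing increases.
```

The point of the formula is the shape of the curve: delays don't grow linearly with load, they explode as the shared resource approaches saturation.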
You may experience that as waiting for a page to appear in the browser or for a file to download. Bottlenecks can also appear in places outside, but connected to, the server itself. Network components get shared among multiple users along with everything else. And, as with those other resources, the more requests there are (and the longer they tie components up), the longer the delays you notice.

The only way to get an objective look at whether a server and its network have enough capacity is to measure and test. All systems can report how much of each resource is being used, and most can compile that information into some form of statistical report. Reviewing that data allows a rational assessment of how much capacity is in use and how much remains. It also allows a knowledgeable person to project how much more sharing is possible, and at what level of impact. Request that information and, if necessary, get help interpreting it. Then you can make a cost-benefit decision based on fact.
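A capacity review like the one described can start very simply: take the measured utilization samples from those reports, find the peak, and compare it to a safety ceiling. The 80% ceiling and the sample readings below are assumptions chosen for illustration, not a standard any host uses.

```python
def headroom(samples_pct, ceiling_pct=80.0):
    """Return (peak, remaining) utilization percentages.

    samples_pct: measured utilization readings, as percentages
    ceiling_pct: the highest sustained utilization you consider safe
    """
    peak = max(samples_pct)
    return peak, max(ceiling_pct - peak, 0.0)

# Hypothetical hourly CPU readings from a usage report:
cpu_samples = [35.0, 42.5, 61.0, 58.0]
peak, spare = headroom(cpu_samples)
# Peak usage is 61%, leaving 19 points of headroom below an 80% ceiling.
```

Judging by the peak rather than the average matters here: a server that averages 40% but spikes to 95% is the one whose visitors notice delays.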