Saturday, April 18, 2009

Securing Your Computing Environment With Strict Passwords

A strong password goes a long way toward securing your computing environment. In a modern computerized workplace, you need to choose different passwords for different systems and devices to enforce system and network security. A password restricts unauthorized access to the systems, networks and devices it protects.

However, it may surprise people with only a working knowledge of computing that common passwords stop only novices. Hackers and IT experts with a sound knowledge of network security can break ordinary passwords within minutes, and even the best firewalls and security hardware cannot protect you from the inventive, destructive minds of sophisticated cybercriminals.

The question arises: is there any way to keep our home or office systems, networks and devices secure from such intrusion? The answer is yes. But the answer does not lie in any software or hardware, because any security product stays secure only for a while; hackers and sniffers soon find a way around it. The real solution is choosing a strong password. Picking a password that is tough to guess, break or crack is not especially difficult, but it is tricky.

You need to follow some long-established rules and guidelines when selecting a password. Normally, people choose something convenient: easy for them to remember but, they hope, hard for others to guess. A typical choice is the name of a spouse, a child or a home town; some people use a date of birth or an anniversary; some combine the two. All of these strategies are insecure.

Always remember that hackers do not rely on guessing alone. They use powerful password-cracking programs that can try every word in a dictionary against your password within minutes, so if your password is a dictionary word, you will lose it. It is better to set cleverness aside and follow these simple guidelines:
1. Never use a dictionary word as your password.
2. Do not assume that a long word or phrase is automatically a good password.
3. Use a strong mix of letters, numbers and special characters.
4. Keep your password secret; never share it with anyone.
5. Prefer a passphrase over a single word.
6. Change your password periodically.
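As an illustration of guideline 3, here is a minimal Python sketch (the article names no tools, so the language and the symbol set are our assumptions) that generates a random password drawing on letters, digits and special characters:

```python
import secrets
import string

SYMBOLS = "!@#$%^&*"  # illustrative special-character set

def generate_password(length: int = 16) -> str:
    """Generate a random password mixing letters, digits and symbols."""
    alphabet = string.ascii_letters + string.digits + SYMBOLS
    while True:
        password = "".join(secrets.choice(alphabet) for _ in range(length))
        # Enforce guideline 3: at least one of each character class.
        if (any(c.islower() for c in password)
                and any(c.isupper() for c in password)
                and any(c.isdigit() for c in password)
                and any(c in SYMBOLS for c in password)):
            return password

print(generate_password())
```

Python's `secrets` module is meant for security-sensitive randomness, unlike the general-purpose `random` module, which is why it is used here.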

Following the above guidelines can save you from password theft and provide you with a secure working environment.

By: Dan Stratton

Article Directory: http://www.articledashboard.com

Safe Harbour's IT services are designed to dramatically reduce or eliminate computer problems in your business while maximizing your network's speed, performance and stability, without the expense of a full-time IT staff. For more information visit www.safe-harbour.ca/

Printer - A Need Of Every Computer

A printer is a revolutionary device in the world of computing. It generates a hard copy of documents stored in electronic form (the soft copy), printing the data on physical media such as paper or transparencies.

Printers are intended for low-volume print jobs that need a hard copy of a document in very little time. Compared with commercial presses they are slow, and their cost per page is fairly high. Even so, as printer quality and performance improve, many jobs that used to be done by professionals in print shops are now done by people on their own local printers.

Most new printers connect over a USB cable, which carries the documents to be printed. Many printers are used principally as local peripherals, attached by a printer cable. Printers with built-in network interfaces, which can produce hard copy for any user on the network, are known as network printers. Printers are often designed to serve local and network-connected users at the same time.

Modern printers can also read electronic media such as memory sticks and memory cards directly, or connect to devices such as digital cameras and scanners over USB or Bluetooth. Printers that include non-printing features are sometimes called multifunction printers (MFPs) or all-in-one (AIO) printers; most MFPs offer printing, scanning, copying and faxing among their features.

In the early years, printer speed was measured in characters per second; today it is measured in pages per minute (ppm). Thirty pages per minute is considered fast, and many inexpensive printers are far slower than that. Laser printers are the fastest type, and they come in several speed variants.
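The arithmetic behind ppm ratings is simple. This hypothetical sketch estimates how long a print job takes at a given rated speed (the figures are illustrative only):

```python
def print_time_seconds(pages: int, ppm: float) -> float:
    """Seconds needed to print `pages` at a rated speed in pages per minute."""
    return pages * 60.0 / ppm

# A 30 ppm laser printer finishes a 30-page report in one minute;
# a 6 ppm budget inkjet needs five times as long.
print(print_time_seconds(30, 30))  # 60.0
print(print_time_seconds(30, 6))   # 300.0
```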

There are various types of printers: a monochrome printer produces images in a single colour, usually black; a colour printer produces images in multiple colours; and a photo printer produces high-resolution images in multiple colours.

The world's first computer printer was a mechanically driven device designed by Charles Babbage in the 19th century. Today many companies, including Canon, HP, LG and Toshiba, provide low-priced printers with admirable performance and features.

By: Tom Lopez

The author suggests you purchase the best computer hardware, such as printers, desktops, laptops, pen drives, headphones, webcams, keyboards, mice and office products, from an online shopping store.

Introduction To Windows Device Drivers

Device drivers are small files an operating system uses to communicate with various hardware devices connected to a computer. Rather than communicating directly with the hardware, the system sends messages through a driver, which calls out the functions for each device as it is needed. A device driver serves its purpose with little to no input from the end user. For example, once a driver is installed, it is likely that you will never have to interact with it. It simply works in the background until called upon by the operating system.

How Windows Device Drivers Work

The Windows operating system allocates memory for its default drivers, and each driver fills entries in what is called a function dispatch table. Whenever the functions of a particular device are required, Windows consults the table and selects the appropriate functions.

For example, suppose you want to use your Canon scanner to scan an image. Windows looks up the function dispatch table for the code needed to carry out that function and sends a request to the scanner. When a driver receives a request, it responds in one of three ways:

• It responds accordingly by carrying out the function and sends confirmation that the task is complete.

• If the device is busy with another task, it accepts the request and puts the job in a queue.

• It notifies the operating system that a problem exists and the job cannot be performed.
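As a rough illustration, the three responses above can be modelled as a toy dispatch loop in Python. This is only a sketch of the idea, not Windows' actual driver model; the `ToyDriver` class and its single `scan` operation are invented for illustration:

```python
import queue

class ToyDriver:
    """Toy model of a driver servicing requests from the operating system."""

    def __init__(self):
        self.busy = False
        self.pending = queue.Queue()  # jobs accepted while the device is busy
        # Analogue of a function dispatch table: operation name -> handler.
        self.dispatch_table = {"scan": self.scan}

    def scan(self, job):
        # Carry out the function and confirm the task is complete.
        return f"done: scanned {job}"

    def handle_request(self, op, job):
        handler = self.dispatch_table.get(op)
        if handler is None:
            return "error: operation not supported"  # notify the OS of a problem
        if self.busy:
            self.pending.put((op, job))              # accept and queue the job
            return "queued"
        return handler(job)

driver = ToyDriver()
print(driver.handle_request("scan", "photo.jpg"))  # done: scanned photo.jpg
driver.busy = True
print(driver.handle_request("scan", "doc.pdf"))    # queued
print(driver.handle_request("fax", "doc.pdf"))     # error: operation not supported
```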

Most drivers operate in what is known as kernel mode. Windows runs program code in either kernel mode or user mode. Processes running in kernel mode can interact directly with hardware and system memory. In contrast, user mode is typically reserved for software applications: programs there can call upon services provided by the operating system, but they cannot access the hardware directly.

The Problems that Exist with Device Drivers

Similar to computer programs, device drivers and hardware have the ability to create numerous problems. Some of the most common issues include instability with particular programs, as well as the crashing of programs or the computer itself. In many cases, updating or completely replacing a device driver can solve hardware and software problems.

How to Update Windows Drivers

Microsoft’s Windows Hardware Quality Labs tests a variety of device drivers. When a driver passes the required tests, it is digitally signed and recognized by Windows. Users can also install unsigned drivers, depending on their computer’s signature-checking level. Level 0 disables signature checking; Level 1 checks for digital signatures and warns you if none are found, though you can proceed with the installation anyway; Level 2 blocks the installation of any driver that has not been signed by Microsoft.

To check the signatures of certain device drivers, access the “Start” menu, click “Run” and type “sigverif” in the command field. This command runs a scan of all the device drivers on your computer and searches for those that are unsigned.

By: Adam K Smith

Adam K Smith is an eminent author on device drivers for all types of computers. To safely update all your drivers and fix your driver problems instantly, visit www.fixyourdrivers.com/

Compare Netbooks

When netbooks first launched there was really only one type: the Asus Eee PC. It was tiny, simple and very cheap, and it has been singled out as the netbook that started the whole craze and created a new market. Since then, almost every major manufacturer (Acer, Samsung, HP, Dell, Toshiba, LG, Lenovo and MSI) has launched its own netbooks to try to capture some of this monumental market growth. The only significant manufacturer that has not yet created a dedicated netbook is Apple, but that may change too.

Although having so many manufacturers in the market is a good thing, creating competition that keeps netbooks progressing technologically and keeps prices low, it has also led to a lot of customer confusion about how to choose one. This is where comparing netbooks comes in.
Luckily, most netbook manufacturers use similar jargon when describing netbook specifications, and this makes comparing them a whole lot easier. When comparing netbooks, or anything for that matter, it is important to compare like for like. The best approach is to choose a set of features that matter to you and compare two or more netbooks along that set. Typical features to compare include price, screen size, RAM, storage, operating system and battery life.
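As a sketch of this like-for-like approach, the following Python snippet ranks a few hypothetical netbooks (the model names and figures are made up) along one factual feature at a time:

```python
# Hypothetical models and figures, purely for illustration.
netbooks = {
    "Model A": {"price": 299, "ram_gb": 1, "battery_hours": 6.5},
    "Model B": {"price": 349, "ram_gb": 2, "battery_hours": 4.0},
    "Model C": {"price": 279, "ram_gb": 1, "battery_hours": 5.0},
}

def rank_by(models, feature, lowest_first=True):
    """Rank netbooks on one factual feature at a time (like for like)."""
    return sorted(models, key=lambda name: models[name][feature],
                  reverse=not lowest_first)

print(rank_by(netbooks, "price"))                              # cheapest first
print(rank_by(netbooks, "battery_hours", lowest_first=False))  # longest first
```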

You will notice that all the features above are factual. It is very difficult, if not impossible, to compare along subjective features such as aesthetics: one person may like a glossy netbook while another prefers a matte finish; one may like a bezel around the screen while another may not.

We hope this article has made it easier for you to decide which features to use when comparing netbooks.

By: Marc123

Marc writes about how to compare netbooks and various netbook deals on his site.

Know How To Fix Computer Problems

All organizations rely on computers to conduct business and operate efficiently. Computers are not restricted to organizational or business purposes; they are widely and extensively used by the general public for all kinds of tasks. Although knowing how to use a computer is now quite common, a large share of computer-literate users know little about computer problems. Computer problems are irritating and can affect a system's performance, consistency and reliability, while solving them can be equally painful, time-consuming and costly. It is a dreadful experience to find that there is a problem with the computer and the data stored on it is in peril.

Computer problems can occur for a variety of reasons: user error, uncontrollable external factors, or a combination of both. Malicious programs called viruses can enter a system through the internet or through storage media shared between computers and infect it. Beyond these, factors such as heat, magnetism, static electricity, power surges, excessive dust on the hardware, bumping or dropping the hard drive casing, incorrectly configured software, or a mishandled PC setup (such as a botched upgrade) can all cause problems.

When a PC runs into a problem, its normal working is affected and its performance is compromised. The computer making unusual grumbling noises, the operating system taking too long to start up or shut down, menus popping up on their own, the computer showing no signs of power at all, applications not working properly, or the system hanging frequently are all symptoms that the PC is facing a problem. The fault may lie with either the hardware or the software.

Some simple steps can keep a computer running free of errors by fixing problems early. The following tips are fundamental guidelines for keeping a PC well maintained, extending component lifespan and reducing the overall likelihood of problems. The number one reason for computer crashes is hardware conflict: each hardware device communicates with other devices through an interrupt request channel (IRQ), and trouble starts when two devices try to claim the same one.
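As a toy illustration of a hardware conflict, the sketch below flags any IRQ claimed by more than one device; the device list is hypothetical:

```python
from collections import defaultdict

def find_irq_conflicts(assignments):
    """Return {irq: [devices]} for every IRQ claimed by more than one device."""
    claims = defaultdict(list)
    for device, irq in assignments:
        claims[irq].append(device)
    return {irq: devs for irq, devs in claims.items() if len(devs) > 1}

# Hypothetical assignments: two cards fighting over IRQ 5.
devices = [("sound card", 5), ("network card", 5), ("keyboard", 1)]
print(find_irq_conflicts(devices))  # {5: ['sound card', 'network card']}
```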

It is always a good habit to keep a backup of all important data in case the original is lost or corrupted; the data can then be restored after the problem is fixed. Good antivirus software can resolve problems such as bugs and security threats, and antivirus programs should be updated regularly, alongside the other software installed on the computer. Only licensed software should be used, as it is less likely to crash or harm the system. Components such as the monitor, keyboard and mouse should be cleaned regularly, and the computer should be kept away from magnetic fields and from heating or cooling sources. One of the best ways to protect a computer, so that the need to fix it never arises, is to perform regular maintenance.
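The backup tip above can be sketched as a small Python helper. This is a minimal illustration using the standard library, not a substitute for proper backup software, and the function name is our own:

```python
import pathlib
import shutil
from datetime import datetime

def backup_folder(source: str, backup_root: str) -> pathlib.Path:
    """Copy `source` into a timestamped subfolder of `backup_root`."""
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    dest = pathlib.Path(backup_root) / f"backup-{stamp}"
    shutil.copytree(source, dest)  # copies the whole directory tree
    return dest
```

Restoring after a failure is the same operation in reverse: copy the timestamped folder back over the damaged one.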

By: Ted Croushore

Is your computer running slowly and showing frequent error messages? Log on to Ted Croushore's www.codepaste.org, where he has come up with new software, Registry Optimizer, which he says is good for restoring the speed and performance of a PC.

What Does PC Hardware Include And What Does PC Support Do

Hardware comprises the mechanical, magnetic, electronic and electrical components that make up a computer system: all of the physical parts of a computer, as distinguished from the data it contains or operates on and from the software that instructs the hardware to accomplish tasks. PC hardware is assembled by the manufacturer and includes any device that is connected to the computer and controlled by its microprocessor.

Details about PC hardware can easily be viewed in Device Manager, which lists all the hardware devices in the system. A user can use Device Manager to update the drivers for hardware devices, and a user logged on as an administrator can enable or disable devices and change their properties. Hardware settings can be modified there and problems troubleshot. The devices installed in or connected to a computer are grouped by the type of resource they use: direct memory access (DMA), input/output (I/O), interrupt request (IRQ) and memory.
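The resource grouping described above can be sketched with a simple dictionary; the device names here are hypothetical, purely to illustrate how Device Manager organizes its resources-by-type view:

```python
from collections import defaultdict

# Hypothetical devices paired with the resource type each one uses.
devices = [
    ("keyboard controller", "IRQ"),
    ("floppy controller", "DMA"),
    ("serial port COM1", "I/O"),
    ("system timer", "IRQ"),
    ("video RAM window", "Memory"),
]

# Group device names under their resource type, as Device Manager does.
by_resource = defaultdict(list)
for name, resource in devices:
    by_resource[resource].append(name)

for resource in sorted(by_resource):
    print(resource, "->", by_resource[resource])
```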


Hardware profiles provide a way to set up and store different hardware configurations and tell Windows which drivers to load when the available hardware changes. Hardware needs resources to operate properly with the computer and its installed software, and hardware compatibility tests exercise the combination of a device, a software driver and an operating system under controlled conditions to verify that all the components operate properly.

Drivers are the groups of files that make it possible for a hardware device to communicate with the operating system. Every device has its own driver files, with a particular version made for a particular hardware model, without which the device cannot work with the computer. Drivers need to be kept up to date for hardware to function properly, and new or different operating systems often demand new driver versions. Updating a driver not only makes the hardware compatible with the latest operating systems but also improves the stability of the computer and the reliability of the hardware.
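A common pitfall when checking whether a driver is out of date is comparing version strings as plain text. This sketch (a generic illustration, not tied to any real driver tool) compares dotted versions numerically instead:

```python
def needs_update(installed: str, latest: str) -> bool:
    """Compare dotted driver version strings numerically, not as text."""
    as_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return as_tuple(installed) < as_tuple(latest)

# Text comparison would wrongly rank "2.9.1" above "2.10.0".
print(needs_update("2.9.1", "2.10.0"))  # True  (9 < 10 numerically)
print(needs_update("3.0.0", "2.10.0"))  # False
```

Note that this sketch assumes both strings have the same number of dotted parts; real version schemes can be messier.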

PC support services are what keep hardware working properly and efficiently. Hardware incompatibility issues are nothing new or unusual, so the need for PC support is vital in the world of computers. Support is provided mainly by manufacturers and by hardware and software professionals, and the service depends on the kind of issue a computer is having; it may be delivered through different channels depending on the situation. Support staff can be reached by e-mail, fax or telephone, or through remote PC services, and onsite service is provided for complicated hardware problems.

Today, PC hardware maintenance has become more cost-effective, and with the arrival of remote PC support it has become more convenient as well, since many of these services carry no-fix, no-fee policies: customers do not have to pay if the issue is not resolved.

By: Ted Croushore

Know What PC Tools Are And How They Are Useful In A PC System

Hardware and software, along with a number of peripheral devices, constitute a PC system. More specifically, a PC system can be divided into four major elements: the hardware, the application programs, the operating system and the users. The hardware (hard disk drives, optical drives, RAM, motherboard, monitor, sound devices, keyboard, system bus), the application software (word processors, spreadsheets, compilers, web browsers) and the operating system (Windows or Linux, for example) all work together to make a PC system function efficiently.

One of the most important components of a PC system is the operating system (OS), without which getting anything meaningful out of the hardware would be a nightmare, if not impossible. The operating system controls and coordinates the use of hardware among the various application programs and users. Since a PC system can be viewed as a collection of resources, hardware, software and data, the operating system provides the means for the proper utilization of those resources. In this it is rather like a government: it performs no useful function by itself, but provides an environment within which other programs can do useful work.

User programs are executed by the PC system, in which hardware and software work in a synchronized manner to give stable performance and reliability. A PC system runs many modules at the same time, so it is not unusual for it to crash. PC systems are always prone to errors and faults, which can jeopardize valuable data and degrade performance. Numerous PC tools, bundled into the operating system, are available to cope with these kinds of problems. These tools are programs designed to help users rectify problems themselves.

PC tools can give a lot of valuable information about the state of a PC system, presented as a graph, a histogram or a report. With this information, any problems that exist can be found and fixed. Over time, as a user adds and removes software, devices and drives, the system is left with extraneous registry entries, which can lead to slower performance; a user can run PC tools that clean up these entries and make the operating system work faster.

Disk Defragmenter is a PC tool that scans local volumes and consolidates fragmented files and folders. A PC runs better with regular disk defragmentation: defragmenting organizes the hard drive so that access to files and programs is more efficient. Other PC tools, such as Event Viewer, Shared Folders, Local Users and Groups, Performance Logs and Alerts, and Device Manager, also help in managing PC system performance.

By: Ted Croushore

Are you in search of a PC guide for removing registry errors? Ted Croushore suggests you give your computer a makeover by scanning it with Registry Optimizer and increasing its performance at his website www.codepaste.org.

Types Of Desktop Memory Configurations

Computer memory retains data for the computer for a certain period of time. RAM, CDs, DVDs and hard disks are all part and parcel of the computer system; these devices are used mainly for storing the computer's data, and they differ in capacity and in storage speed.

The CPU (central processing unit) is connected to the computer's main memory, where the programs and data currently in use are stored. Modern computers pair RAM (random access memory) with solid-state memory, and these devices are attached to the CPU by the memory bus, otherwise called the address bus.

In some computers, cache memory sits alongside the RAM. The cache holds small chunks of data that the CPU is about to use; it is installed mainly to cut fetch times and thereby increase the CPU's working speed. RAM itself is commonly considered the most vital part of computer memory, and it is made up of integrated semiconductor chips.

RAM stands apart from other kinds of memory in that any location in RAM can be accessed at the same speed. Many RAM chips are volatile: they drop all their data when the power supply is switched off. Some computers are provided with shadow RAM, which copies data into faster RAM so the computer can work more efficiently.

Earlier, computers were very expensive, so end users preferred to upgrade a machine rather than buy a new one. Now the scenario is completely different: hardware prices have fallen, and interest has shifted toward replacing computers rather than upgrading them part by part.

Replacing a computer outright is economical only for home users, not for high-end users, since replacing servers is still very costly. In those cases it is better to upgrade the computer with the required spare parts instead.

Memory is available in the market in different configurations; for laptops and desktop computers it ranges from 128 MB to 1 GB, a capacity sufficient for regular use. More memory is required when the computer runs more programs. In such cases it is not necessary to buy a new computer; the existing one can be upgraded through its internal expansion slots.
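As a small example of inspecting memory configuration programmatically, this sketch parses the `MemTotal` line from a Linux `/proc/meminfo` dump; the sample text is made up, and real systems report their own figures:

```python
def total_memory_mb(meminfo_text: str) -> int:
    """Parse the MemTotal line of a Linux /proc/meminfo dump (value in kB)."""
    for line in meminfo_text.splitlines():
        if line.startswith("MemTotal:"):
            kilobytes = int(line.split()[1])
            return kilobytes // 1024
    raise ValueError("MemTotal line not found")

sample = "MemTotal:     1048576 kB\nMemFree:       204800 kB\n"
print(total_memory_mb(sample))  # 1024, i.e. a 1 GB machine
```

On a real Linux machine you could pass in `open("/proc/meminfo").read()` instead of the sample string.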

By: Lesley Lyon

desktopmemory.info helps us get the best memory solution for our desktop PC: read reviews and compare prices of the different types of memory on the market. ramupgrade.info helps us find out how to improve a system's performance with the right RAM upgrade for the computer's configuration.

Laptop Processor: Why Middle End Processors Are Most Preferred

The processor is considered the heart of any computer and is listed first in nearly every computer listing. A specification sheet will state the model, brand and processor speed, whereas marketing material tends to highlight only a speed rating. Because of this, buyers can find it very tough to judge the quality of a machine: a processor that runs at a particular speed in one model will not necessarily maintain the same speed in other models from the same manufacturer. Let us now look at the various categories of processors and how they perform.

Some processors are considered outdated because their production has stopped, and the remaining stock is sold for economy systems or refurbished machines. These processors take a long time to run programs and applications, and in some cases they cannot run software currently on the market. It is better to avoid computers with this type of processor, unless the system is needed only for fundamental tasks such as web browsing or word processing.

Some discontinued processors remain very economical while offering superior performance. They fall into two categories: older high-end processors whose production has stopped, and new low-end budget processors. The discontinued high-end parts generally give the better result: despite a slightly lower clock speed, their architecture lets them perform well in most computing tasks compared with the newer budget processors.

Middle-end processors perform well and give the desired result for the money spent. Though not the fastest on the market, their performance across the board is impressive; they do not, however, have the long useful life of high-end processors.

High-end processors sit at the top of the range, particularly where processing power is concerned, and a laptop can be rated as superior only when it is fitted with a good processor. The customer may have to shell out a little more money for these laptops: today's top processors are of premium quality compared with middle- and low-end parts, yet although their price is nearly double, the performance gain is only in the range of 25% to 50%.
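The trade-off described above is easy to put into numbers. This sketch compares performance per unit of money using hypothetical scores and prices (the figures are illustrative, not benchmarks):

```python
def perf_per_dollar(performance: float, price: float) -> float:
    """Relative performance per unit of money spent."""
    return performance / price

mid_end = perf_per_dollar(100, 500)    # baseline: score 100 at $500
high_end = perf_per_dollar(140, 1000)  # ~40% faster at double the price
print(mid_end > high_end)  # True: the middle-end chip gives more per dollar
```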

By: Lesley Lyon

laptopprocessor.info helps us get the best processor for our laptop to improve its efficiency and speed: read reviews and compare prices of all laptop processor brands here and make a wise choice. systemmemory.info lets us find the best memory solution for our systems: read reviews and compare prices of the different types of memory on the market.

System Memory: Tips To Choose The Right One

Just as a house or a car is cleaned regularly, a computer also needs a regular clean-up to keep its memory in shape. When unwanted programs and files are not cleaned up, they occupy space and in turn affect the computer's memory. Regular cleaning is therefore warranted to keep any computer in good condition.

Sometimes when a CD is put in the drive, the computer cannot read it or takes a long time to do so. This is often down to a problem with the computer's memory. Memory has to store programs and display them on screen the moment the user asks for them; when searching and display are delayed, the computer's memory needs attention.

Installing the correct type of memory lets the computer run much faster than before and display all the required programs promptly, without delay. There is no doubt that computers and laptops have become complicated and sophisticated machines, and the need for a memory upgrade can crop up at any time.

Memory plays a vital role in a computer: once the memory is corrupted, the data and programs held in it are lost, so memory maintenance should be given the utmost priority. The storage capacity of the computer depends mainly on its memory capacity.

A computer's memory can be upgraded simply by replacing or adding a memory module, depending on the requirement. Modules are available in different capacities, such as 512 MB, 1 GB and 2 GB; they can be purchased from local shops and fitted by users themselves. It is not a difficult task: anyone who knows how to open the case can do it by seating the module in the appropriate slot. Modules should be handled very carefully to avoid damage.

By: Lesley Lyon

Hard To Find Computer Parts

Finding computer parts can take you to some interesting places. If you are lucky you can just pick up the phone and pay someone who has already done the necessary legwork; if not, you are going to have to do it yourself.

Don't you just love that term, do it yourself. Buying up old computers is one way to get the parts you need, but there is no guarantee the part you are looking for will be in good condition or up to date. To add insult to injury, the part may not be there at all, although you can sometimes find computers in good shape, a great source for a hobbyist.

Combing the thrift stores is another way to find computer parts. A lot of these computers are much older models, but in pretty good shape.

What you get out of the older ones really depends on what you want to do: if you are into state-of-the-art equipment, these parts will not do for you at all. Odd as it might sound, if you spend a Saturday afternoon just driving around, you will be surprised how many computers people leave out on the curb, and after all, you are only looking for parts.

But that is enough of the pull-yourself-up-by-your-own-bootstraps talk. Your business is busy and you need a reliable computer parts source to keep it up and running; you also need your computer parts to be reliable.

Then there is the favorite place to look: online. Go to a reputable online computer parts store, type in your part code and find what you need. Online stores carry a very sophisticated line of computer parts, more than enough to take care of your business needs.

The parts are of high quality and guaranteed to work: power supplies, IO cards, switch cables, and the list goes on. You won't need to guess, and you won't go through the trauma of tearing down a whole computer just to get an individual part that may not work, or may not be there at all. If you have the time, by all means tinker; otherwise, don't be a tough guy. Save yourself the aggravation and let the professionals get you the computer parts you need from a source you can count on. Shipping is easy, your parts can be on their way, and you can move on to the more pressing matters of your business.

Everything is fast today in the business world and you must find fast ways to keep your business moving. It is necessary to do business with a company that understands time is money and you need reliable service as well as a great product.

By: Kevin Bright

Kevin Bright - www.microbite.co.uk/ Microbite Prides itself on being able to source ALL computer parts for both new and older systems. We have a massive searchable database of part codes on our website - Global Delivery - Call Now!

Ram Upgrade: The Cost Effective Solutions

Some people have very old computers that cannot run the programs released today. Necessity may push them to upgrade so they can run current software, while their finances prevent them from purchasing new machines. Under these circumstances, it is advisable to update the old computer with the necessary repairs and additions rather than throwing it away.

When the RAM in a system is upgraded, the end user gains a number of benefits. The start-up time of the computer is reduced and programs load much faster. By upgrading the RAM, the user can treat the old computer almost as a new one without spending much money or time, and can install new software without it hindering smooth operation. RAM is available in different capacities, and the user has to choose the capacity that matches the need: if more programs have to be loaded on the computer, a higher capacity is necessary to keep the system running fast when all of them are used simultaneously.

It is not an easy task to find the exact RAM for a computer, and at a good price; a little homework is required. Buying RAM online can save money, since there is no middleman taking a commission on the sale, and it saves time as well, because driving from shop to shop is avoided.

Browsing online reveals many varieties of RAM upgrade. The exact RAM that suits the computer has to be chosen for it to function properly. So, to save both time and money on a RAM upgrade, browse the sites, select the required RAM and then decide on the purchase.

Some online companies dispatch RAM to the required address with free delivery, and some give larger discounts for bulk orders.

By: Lesley Lyon

desktopmemory.info, helps us get the best memory solution for our desktop PC. Read reviews and compare prices of different types of memory options in the market. ramupgrade.info helps us find out how to improve your system's performance with the right RAM upgrade matching the computer's configuration.

Ensuring Security Of Wireless Networks

Nowadays, wireless networks are very common. A wireless network has far less cable clutter, making it a convenient way to implement and manage a network. Troubleshooting a wireless network is also easier than a wired one, so people prefer wireless networks at the workplace as well as at home.

However, along with this convenience and ease of implementation, wireless networks are more vulnerable to security flaws. Even a person with little IT knowledge can easily access an unsecured wireless network and use it unethically. It is therefore very important to restrict unauthorized access to the wireless networks at your home or workplace. There are some standard measures for securing a wireless network from unauthorized access. To secure yours, you can take the following actions:

Restrict Wireless Network Broadcasting
By default, your Wi-Fi router broadcasts its network name so that devices with wireless capability can detect the networks available in range. Leaving this default in place makes your wireless network visible to everybody. To restrict automatic network discovery, you can disable this broadcast; go through your wireless router's manual to learn how.

Enable Data Encryption
Data encryption is a well-accepted way to secure wireless networks. Nowadays, almost every Wi-Fi router or access point comes with the WEP (Wired Equivalent Privacy) or WPA (Wi-Fi Protected Access) encryption schemes. By enabling one of them, you can restrict access to your Wi-Fi network; prefer WPA where available, as WEP is the older scheme and has known weaknesses.

Choosing Strong Network Password
When enabling data encryption, you are required to set a password that allows access to your wireless network. Choosing a strong password is very important to achieve the required level of security. An ideal password is a combination of letters, numbers and symbols, and comprises several characters. Avoid using your name, date of birth or other easily guessed details as the password for your wireless network.
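As a sketch of what such a password can look like, here is a short Python example that draws a random password from mixed character classes using the standard secrets module (the length of 16 is an illustrative choice, not a requirement of any particular router):

```python
import secrets
import string

def strong_password(length: int = 16) -> str:
    # Mix upper- and lower-case letters, digits and punctuation so the
    # result is not a dictionary word and is hard to guess or brute-force.
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(strong_password())  # different on every call
```

The secrets module uses the operating system's cryptographic random source, which is preferable to the random module for anything security-related.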

Activating Firewall
All wireless access points come with a built-in firewall to stop unauthorized incoming and outgoing connections through your wireless network. Learn how to use and customize this firewall for the maximum level of wireless network security.

By following the instructions above, you can secure the wireless networks at your home or workplace and enjoy the benefits of going wireless without any worries.

By: Dan Stratton

Safe Harbour's IT services are designed to dramatically reduce or eliminate computer problems in your business while maximizing your network's speed, performance, and stability, without the expense of a full-time IT staff. For More Information Visit: - www.safe-harbour.ca/

Network Security Management Services

Network security means securing both public and private computer networks, which are used every day to conduct transactions among businesses and individuals. Any business that relies on an IT network needs to establish a strong, secure network for its data and systems. There is an increasing need to secure networks within organizations, and to achieve network security, all requirements have to be met so that the networks can be used securely.

Organizations spend a large portion of their budget on IT network security, and it is necessary that the networks themselves have appropriate levels of protection. An effective and valuable network security strategy requires identifying the threats and choosing the most effective tools to combat them. Email security management and antivirus security are effective services for keeping critical data and communications safe from intruders, attacks and other security threats.

Email security management

Email viruses can reach your system through harmful attachments and infect it. Email security management helps to stop unwanted material and reduce spam in emails. It also provides message-tracking capabilities, so that emails can be followed for troubleshooting and auditing purposes, and it helps in examining the security threats facing your corporate email system. Email security management gives you reliable email security performance and safeguards your important emails against all threats.

Antivirus security

With the increase in attacks and viruses on the internet, antivirus security software has become a necessity. Antivirus software makes your online surfing, searching and chatting safe, and protects your business networks from web threats such as viruses, spyware and all other types of malware that can threaten your valuable personal information.

Benefits of Network Security Management

There are a number of important benefits to purchasing network security management services, as it is better to keep your networks safe than to suffer damage from deadly viruses and attacks.

· Managed services improve IT security and effectively manage the whole network security program.
· They disclose any weaknesses in your network, server and desktop infrastructure.
· They identify solutions to integrate the networks within existing environments.
· A network security provider can also supply a firewall with reputation-based global intelligence, which blocks malicious traffic coming from and going to the internet.
· They make it safer, easier and more convenient for computer users to access the network from remote locations.
· They help enhance system security for sensitive data.
· They regularly audit security efforts with a comprehensive system.
· Without network security, anybody could hack files or data from the organization's network.
· They reduce overall information security risk.

Select best Network Security Management Services

To avoid threats, even small and medium-sized businesses prefer managed network security services. Without reliable network management services, a company would find it difficult to prevent attacks from happening; with reliable and cost-effective services, you can safeguard your network against such attacks and malicious intrusions. When choosing network security management services, you need to look for several capabilities. First, the provider should offer solutions custom-made for your business. Second, the services should maintain the integrity of your network. Third, the provider should be able to give 24/7 technical and troubleshooting support. A company or business can surely benefit from the expertise and solutions of such network management services.

Fiverivers provides cost-effective network security management so that you can budget your protection and productivity with no surprises. Our network security management identifies the most critical information assets and network nodes, and with our security service you no longer have to worry about any type of threat or attack. We provide 24/7 network security management services.

By: Smith Jolene

Author info:
Smith Jolene writes articles for fiveriverssupport, which provides network security management to secure your systems and network, including email security management services and antivirus security services.

How Secure Is Your It Network?

Nowadays, just about all major businesses and corporations are powered by one or more IT networks, which serve as the lifeblood of day-to-day business operations. Each unit or node of the network is powered by computers and individuals that work together in their dedicated positions and fulfil their assigned roles. IT networks are delicate: any disturbance in the network can hinder or slow down overall network performance to some degree. Network administrators must not only do their jobs in preventing these issues from happening, but also evaluate how secure their IT networks really are.

What about your network?
Have you ever considered doing a security audit for your network?
Just because your network has never run into any problems doesn't mean its security is at its maximum. It is therefore important to test your network for vulnerabilities, and to stay updated with current trends, in case newly discovered vulnerabilities pose a threat to your network one day.

There will always be people up to no good, developing hacking programs and viruses that are specifically meant to disrupt or harm networks directly. Their methods may involve data theft, data corruption, or a combination of the two, and for the average business this can be a serious threat to the wellbeing of both the network and the business.

Security software and services are among the allies of these networks: they aim to hinder rogue software and stay one step ahead of the threats so that IT networks do not suffer at all. The payload of these vulnerabilities can sometimes be irreversible, as when data is stolen, so preventative measures must be enforced, especially in businesses that keep highly sensitive data on their servers.

Resorting to the various security software and services shouldn't be your only option, as they may not have defences against the newest threats right away. By making a habit of doing your own network security maintenance, you can keep these new threats at bay while you wait for your network security programs to update.

Common things to protect are sensitive files and the entry web pages of the network. Passwords are a good first line of defence, restricting access to the individuals within the network who know them. Making a password complicated reduces the risk of brute-force attacks, a common method of breaking into password-protected areas. Using up-to-date system software, from the operating system to any other programs with network access, can also reduce the chances of a security breach, since patches aim to plug security holes.
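To see why a complicated password blunts brute-force attacks, consider the size of the search space an attacker faces. A small illustrative Python sketch (the numbers are back-of-the-envelope, not benchmarks of any real attack tool):

```python
def search_space(charset_size: int, length: int) -> int:
    # Every position can hold any of charset_size characters, so an
    # attacker must try up to charset_size ** length guesses.
    return charset_size ** length

# Lowercase letters only, 8 characters: ~2.1e11 guesses.
lower_8 = search_space(26, 8)
# All ~94 printable ASCII characters, 8 characters: ~6.1e15 guesses.
full_8 = search_space(94, 8)

# Widening the character set multiplies the attacker's work ~29,000-fold.
print(full_8 // lower_8)
```

Lengthening the password helps even more, since the exponent grows rather than the base.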

Always take some time off to check your network's security. If you don't know how to do it, you can ask a skilled ethical hacker to try to break into your network. Former hackers are among the best people to consult, as they dedicate themselves to helping personnel and use their previous hacking experience to improve the security of networks. This will allow your network to run smoothly without fear of a network crisis.

By: Derek Rogers

Derek Rogers is a freelance writer who writes for a number of UK businesses. For Network Security, he recommends Network 24, a leading network security solution company.

Internetworking

Internetworking involves connecting two or more computer networks via gateways using a common routing technology. The result is called an internetwork (often shortened to internet).

The most notable example of internetworking is the Internet (capitalized), a network of networks based on many underlying hardware technologies, but unified by an internetworking protocol standard, the Internet Protocol Suite (TCP/IP).

The network elements used to connect individual networks are known as routers, but were originally called gateways, a term that was deprecated in this context, due to confusion with functionally different devices using the same name.

The interconnection of networks with bridges (link layer devices) is sometimes incorrectly termed "internetworking", but the resulting system is simply a larger, single subnetwork, and no internetworking protocol (such as IP) is required to traverse it. However, a single computer network may be converted into an internetwork by dividing the network into segments and then adding routers between the segments.

The original term for an internetwork was catenet. Internetworking started as a way to connect disparate types of networking technology, but it became widespread through the developing need to connect two or more local area networks via some sort of wide area network. The definition now includes the connection of other types of computer networks such as personal area networks.

The Internet Protocol is designed to provide an unreliable (i.e., not guaranteed) packet service across the network. The architecture avoids intermediate network elements maintaining any state of the network; instead, this function is assigned to the endpoints of each communication session. To transfer data reliably, applications must utilize an appropriate Transport Layer protocol, such as the Transmission Control Protocol (TCP), which provides a reliable stream. Some applications use a simpler, connection-less transport protocol, the User Datagram Protocol (UDP), for tasks which do not require reliable delivery of data or that require real-time service, such as video streaming.
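The contrast can be seen in miniature with Python's standard socket module: a UDP sender transmits a datagram with no handshake and no delivery guarantee, which is exactly the connection-less service described above. A loopback sketch (the helper function name is mine, for illustration only):

```python
import socket

def udp_roundtrip(message: bytes) -> bytes:
    # Receiver: bind an ephemeral UDP port on the loopback interface.
    recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    recv.bind(("127.0.0.1", 0))
    recv.settimeout(2.0)
    addr = recv.getsockname()

    # Sender: UDP is connectionless -- no handshake, just fire a datagram.
    send = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send.sendto(message, addr)

    data, _ = recv.recvfrom(4096)
    send.close()
    recv.close()
    return data

print(udp_roundtrip(b"hello"))
```

Over the loopback interface the datagram reliably arrives, but across a real internetwork nothing in UDP itself would retransmit it if it were lost; that is the job TCP takes on.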
-------------------------------------------------------------------------------------------------

7-layer OSI MODEL

The OSI (Open System Interconnection) model was developed by ISO in 1984 to provide a reference model for the complex aspects of network communication. It divides the different functions and services provided by network hardware and software into 7 layers. This facilitates modular engineering, simplifies teaching and learning network technologies, helps to isolate problems, and allows vendors to focus on just the layer(s) in which their hardware or software is implemented while still creating products that are compatible, standardized and interoperable.

The 7 layers of the OSI Model are described below; to remember them in the correct order, from top to bottom, a common mnemonic is often used: All People Seem To Need Data Processing.


The Application, Presentation and Session layers are known as the upper layers and are implemented in software. The Transport and Network layers are mainly concerned with protocols for the delivery and routing of packets to a destination and are implemented in software as well. The Data Link layer is implemented in both hardware and software, while the Physical layer is implemented in hardware only, hence its name. These last two layers define LAN and WAN specifications.
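The layer stack and its mnemonic can be written out as a small Python sketch (purely illustrative):

```python
# The seven OSI layers, top (layer 7) to bottom (layer 1).
OSI_LAYERS = [
    "Application",   # 7 -- upper layer, software
    "Presentation",  # 6 -- upper layer, software
    "Session",       # 5 -- upper layer, software
    "Transport",     # 4 -- end-to-end delivery, software
    "Network",       # 3 -- routing of packets, software
    "Data Link",     # 2 -- hardware and software
    "Physical",      # 1 -- hardware only
]

MNEMONIC = "All People Seem To Need Data Processing".split()

# Each mnemonic word's initial matches the corresponding layer name.
for word, layer in zip(MNEMONIC, OSI_LAYERS):
    assert word[0] == layer[0]

print(" -> ".join(OSI_LAYERS))
```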

By: ahamed

cisco-training640-802.blogspot.com/

How To Test Your Broadband Speed

It is very important that you know how to test your broadband speed, and that you do it on a regular basis. Think about all those times you wanted to hit your PC because a page was taking far too long to load, and you automatically blamed it on the webmaster.

How do you honestly know it was their fault?

It’s just an assumption, one that I have made more times than I can remember, and I am sure I am not unique in this. In England, for example, where I live, there are many, many different broadband providers, and they all have different reputations, some bad, some good, and most of the time deserved. The bigger companies in general offer the worst service, as they simply have too much to deal with, and the quality of the product then suffers as a result.

I am not here to name and shame, as that wouldn’t be fair; I am not here today to write a review of them. But I do want to point out the importance of testing your broadband speed: if you don’t, you will be left wondering whether you are getting the best service on offer. If you do test your broadband, you may even get to the bottom of any problems you have and improve the quality of your broadband experience.

One key reason why your broadband speed may be affected is living in a newly built house. When these properties are being developed, the signal strength on the phone line can be weaker than with an older property, which causes you to lose part of your speed. Of course there are many other reasons, but this is a good example.

Most of us have very little IT knowledge and just assume that we turn on the computer and, hey presto, launch Internet Explorer, but there is a lot more to it than that. For starters, your speed is a measure of how fast data can transfer from another computer on the internet to your own, so obviously you want the best speed possible.

Testing your broadband speed is very easy indeed. All you need is a simple tester, and you can find out exactly how good your connection is. You can then deal with any problem you find, or be thankful that yours is working perfectly.
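Under the hood, a speed tester simply times how long a known amount of data takes to arrive and converts that into megabits per second. A minimal Python sketch of the arithmetic (the function name is mine, not taken from any particular tester):

```python
def throughput_mbps(bytes_transferred: int, seconds: float) -> float:
    # Broadband speeds are quoted in megabits per second:
    # 1 byte = 8 bits, 1 megabit = 1,000,000 bits.
    bits = bytes_transferred * 8
    return bits / seconds / 1_000_000

# Example: a 5 MB test file that took 4.0 seconds to download.
print(throughput_mbps(5_000_000, 4.0))  # -> 10.0 (Mbps)
```

A real tester would download a test file from a nearby server and time it, but the conversion above is why a "10 Mbps" line moves only about 1.25 megabytes of data per second.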

By: Samantha Milner

"Let Jason Slater show you how to become an IT Pro Today!"

Head over to www.jasonslater.co.uk/tools/broadband-speed-tester/ to have your very own on screen Broadband Test carried out!

Linux Operating System Errors And Resolution

Linux is among the most popular operating systems as far as security and performance are concerned. It is a Unix-like operating system and the best-known example of open-source development and free software: in general, the underlying Linux source code can be modified, redistributed and used freely by anyone. Red Hat Enterprise Linux 5.2 is the latest version of the Red Hat Enterprise Linux distribution.

As we all know, every operating system has its failures and can produce errors at any time without any prior warning. There are some common error messages faced by Linux users while using this operating system. Some of the most frequent are the following:

• Unknown terminal type Linux.

• Unrecognized option '-m486'.

• bdflush not running.

• cannot read table of mounted file systems.

• Cannot initialize drive XYZ.

• EPERM Operation not permitted.

• Modprobe can't locate module ``XXX'' (and similar messages).

• Mounting unchecked file system.

• EINTR Interrupted system call.

In most of the situations above, when a user faces one of these problems, it becomes difficult to access or manage data, and normal tasks that could otherwise be performed effortlessly become impossible. If the data that has been rendered inaccessible is important, it becomes essential to recover it as soon as possible. In this case, the user needs the help of Linux data recovery software.

Linux data recovery software is the ultimate way out of all Linux-related data recovery problems. It recovers data from formatted hard drives, even where the file system has been changed, and it recovers lost logical drives and data from physical disks or any removable media.

Stellar Phoenix Linux Recovery software provides data recovery from the Ext2, Ext3 and ReiserFS file systems of the Linux operating system.

By: Allen Sood

Allen Sood a student of Mass Communication doing research on data recovery and linux recovery software. He is also a freelancer for www.stellarinfo.com

FOUNDATION OF SIDEREAL ASTRONOMY

Until nearly a hundred years ago the stars were regarded by practical
astronomers mainly as a number of convenient fixed points by which the
motions of the various members of the solar system could be determined
and compared. Their recognised function, in fact, was that of milestones
on the great celestial highway traversed by the planets, as well as on
the byways of space occasionally pursued by comets. Not that curiosity
as to their nature, and even conjecture as to their origin, were at any
period absent. Both were from time to time powerfully stimulated by the
appearance of startling novelties in a region described by philosophers
as "incorruptible," or exempt from change. The catalogue of Hipparchus
probably, and certainly that of Tycho Brahe, some seventeen centuries
later, owed each its origin to the temporary blaze of a new star. The
general aspect of the skies was thus (however imperfectly) recorded from
age to age, and with improved appliances the enumeration was rendered
more and more accurate and complete; but the secrets of the stellar
sphere remained inviolate.

In a qualified though very real sense, Sir William Herschel may be
called the Founder of Sidereal Astronomy. Before his time some curious
facts had been noted, and some ingenious speculations hazarded,
regarding the condition of the stars, but not even the rudiments of
systematic knowledge had been acquired. The facts ascertained can be
summed up in a very few sentences.

Giordano Bruno was the first to set the suns of space in motion; but in
imagination only. His daring surmise was, however, confirmed in 1718,
when Halley announced[3] that Sirius, Aldebaran, Betelgeux, and Arcturus
had unmistakably shifted their quarters in the sky since Ptolemy
assigned their places in his catalogue. A similar conclusion was reached
by J. Cassini in 1738, from a comparison of his own observations with
those made at Cayenne by Richer in 1672; and Tobias Mayer drew up in
1756 a list showing the direction and amount of about fifty-seven proper
motions,[4] founded on star-places determined by Olaus Römer fifty years
previously. Thus the stars were no longer regarded as "fixed," but the
question remained whether the movements perceived were real or only
apparent; and this it was not yet found possible to answer. Already, in
the previous century, the ingenious Robert Hooke had suggested an
"alteration of the very system of the sun,"[5] to account for certain
suspected changes in stellar positions; Bradley in 1748, and Lambert in
1761, pointed out that such apparent displacements (by that time well
ascertained) were in all probability a combined effect of motions both
of sun and stars; and Mayer actually attempted the analysis, but without
result.

On the 13th of August, 1596, David Fabricius, an unprofessional
astronomer in East Friesland, saw in the neck of the Whale a star of the
third magnitude, which by October had disappeared. It was, nevertheless,
visible in 1603, when Bayer marked it in his catalogue with the Greek
letter Omicron, and was watched, in 1638-39, through its phases of
brightening and apparent extinction by a Dutch professor named
Holwarda.[6] From Hevelius this first-known periodical star received the
name of "Mira," or the Wonderful, and Boulliaud in 1667 fixed the length
of its cycle of change at 334 days. It was not a solitary instance. A
star in the Swan was perceived by Janson in 1600 to show fluctuations of
light, and Montanari found in 1669 that Algol in Perseus shared the same
peculiarity to a marked degree. Altogether the class embraced in 1782
half-a-dozen members. When it is added that a few star-couples had been
noted in singularly, but it was supposed accidentally, close
juxtaposition, and that the failure of repeated attempts to measure
stellar parallaxes pointed to distances _at least_ 400,000 times that of
the earth from the sun,[7] the picture of sidereal science, when the
last quarter of the eighteenth century began, is practically complete.
It included three items of information: that the stars have motions,
real or apparent; that they are immeasurably remote; and that a few
shine with a periodically variable light. Nor were these scantily
collected facts ordered into any promise of further development. They
lay at once isolated and confused before the inquirer. They needed to be
both multiplied and marshalled, and it seemed as if centuries of patient
toil must elapse before any reliable conclusions could be derived from
them. The sidereal world was thus the recognised domain of far-reaching
speculations, which remained wholly uncramped by systematic research
until Herschel entered upon his career as an observer of the heavens.

The greatest of modern astronomers was born at Hanover, November 15,
1738. He was the fourth child of Isaac Herschel, a hautboy-player in the
band of the Hanoverian Guard, and was early trained to follow his
father's profession. On the termination, however, of the disastrous
campaign of 1757, his parents removed him from the regiment, there is
reason to believe, in a somewhat unceremonious manner. Technically,
indeed, he incurred the penalties of desertion, remitted--according to
the Duke of Sussex's statement to Sir George Airy--by a formal pardon
handed to him personally by George III. on his presentation in 1782.[8]
At the age of nineteen, then, his military service having lasted four
years, he came to England to seek his fortune. Of the life of struggle
and privation which ensued little is known beyond the circumstances that
in 1760 he was engaged in training the regimental band of the Durham
Militia, and that in 1765 he was appointed organist at Halifax. In the
following year he removed to Bath as oboist in Linley's orchestra, and
in October 1767 was promoted to the post of organist in the Octagon
Chapel. The tide of prosperity now began to flow for him. The most
brilliant and modish society in England was at that time to be met at
Bath, and the young Hanoverian quickly found himself a favourite and the
fashion in it. Engagements multiplied upon him. He became director of
the public concerts; he conducted oratorios, engaged singers, organised
rehearsals, composed anthems, chants, choral services, besides
undertaking private tuitions, at times amounting to thirty-five or even
thirty-eight lessons a week. He in fact personified the musical activity
of a place then eminently and energetically musical.

But these multifarious avocations did not take up the whole of his
thoughts. His education, notwithstanding the poverty of his family, had
not been neglected, and he had always greedily assimilated every kind of
knowledge that came in his way. Now that he was a busy and a prosperous
man, it might have been expected that he would run on in the deep
professional groove laid down for him. On the contrary, his passion for
learning seemed to increase with the diminution of the time available
for its gratification. He studied Italian, Greek, mathematics;
Maclaurin's Fluxions served to "unbend his mind"; Smith's Harmonics and
Optics and Ferguson's Astronomy were the nightly companions of his
pillow. What he read stimulated without satisfying his intellect. He
desired not only to know, but to discover. In 1772 he hired a small
telescope, and through it caught a preliminary glimpse of the rich and
varied fields in which for so many years he was to expatiate.
Henceforward the purpose of his life was fixed: it was to obtain "a
knowledge of the construction of the heavens";[9] and this sublime
ambition he cherished to the end.

A more powerful instrument was the first desideratum; and here his
mechanical genius came to his aid. Having purchased the apparatus of a
Quaker optician, he set about the manufacture of specula with a zeal
which seemed to anticipate the wonders they were to disclose to him. It
was not until fifteen years later that his grinding and polishing
machines were invented, so the work had at that time to be entirely done
by hand. During this tedious and laborious process (which could not be
interrupted without injury, and lasted on one occasion sixteen hours),
his strength was supported by morsels of food put into his mouth by his
sister,[10] and his mind amused by her reading aloud to him the Arabian
Nights, Don Quixote, or other light works. At length, after repeated
failures, he found himself provided with a reflecting telescope--a
5-1/2-foot Gregorian--of his own construction. A copy of his first
observation with it, on the great Nebula in Orion--an object of
continual amazement and assiduous inquiry to him--is preserved by the
Royal Society. It bears the date March 4, 1774.[11]

In the following year he executed his first "review of the heavens,"
memorable chiefly as an evidence of the grand and novel conceptions
which already inspired him, and of the enthusiasm with which he
delivered himself up to their guidance. Overwhelmed with professional
engagements, he still contrived to snatch some moments for the stars;
and between the acts at the theatre was often seen running from the
harpsichord to his telescope, no doubt with that "uncommon precipitancy
which accompanied all his actions."[12] He now rapidly increased the
power and perfection of his telescopes. Mirrors of seven, ten, even
twenty feet focal length, were successively completed, and unprecedented
magnifying powers employed. His energy was unceasing, his perseverance
indomitable. In the course of twenty-one years no less than 430
parabolic specula left his hands. He had entered upon his forty-second
year when he sent his first paper to the _Philosophical Transactions_;
yet during the ensuing thirty-nine years his contributions--many of them
elaborate treatises--numbered sixty-nine, forming a series of
extraordinary importance to the history of astronomy. As a mere explorer
of the heavens his labours were prodigious. He discovered 2,500 nebulæ,
806 double stars, passed the whole firmament in review four several
times, counted the stars in 3,400 "gauge-fields," and executed a
photometric classification of the principal stars, founded on an
elaborate (and the first systematically conducted) investigation of
their relative brightness. He was as careful and patient as he was
rapid; spared no time and omitted no precaution to secure accuracy in
his observations; yet in one night he would examine, singly and
attentively, up to 400 separate objects.

The discovery of Uranus was a mere incident of the scheme he had marked
out for himself--a fruit, gathered as it were by the way. It formed,
nevertheless, the turning-point in his career. From a star-gazing
musician he was at once transformed into an eminent astronomer; he was
relieved from the drudgery of a toilsome profession, and installed as
Royal Astronomer, with a modest salary of £200 a year; funds were
provided for the construction of the forty-foot reflector, from the
great space-penetrating power of which he expected unheard-of
revelations; in fine, his future work was not only rendered possible,
but it was stamped as authoritative.[13] On Whit-Sunday 1782, William
and Caroline Herschel played and sang in public for the last time in St.
Margaret's Chapel, Bath; in August of the same year the household was
moved to Datchet, near Windsor, and on April 3, 1786, to Slough. Here
happiness and honours crowded on the fortunate discoverer. In 1788 he
married Mary, only child of James Baldwin, a merchant of the city of
London, and widow of Mr. John Pitt--a lady whose domestic virtues were
enhanced by the possession of a large jointure. The fruit of their union
was one son, of whose work--the worthy sequel of his father's--we shall
have to speak further on. Herschel was created a Knight of the
Hanoverian Guelphic Order in 1816, and in 1821 he became the first
President of the Royal Astronomical Society, his son being its first
Foreign Secretary. But his health had now for some years been failing,
and on August 25, 1822, he died at Slough, in the eighty-fourth year of
his age, and was buried in Upton churchyard.

His epitaph claims for him the lofty praise of having "burst the
barriers of heaven." Let us see in what sense this is true.

The first to form any definite idea as to the constitution of the
stellar system was Thomas Wright, the son of a carpenter living at
Byer's Green, near Durham. With him originated what has been called the
"Grindstone Theory" of the universe, which regarded the Milky Way as the
projection on the sphere of a stratum or disc of stars (our sun
occupying a position near the centre), similar in magnitude and
distribution to the lucid orbs of the constellations.[14] He was
followed by Kant,[15] who transcended the views of his predecessor by
assigning to nebulæ the position they long continued to occupy, rather
on imaginative than scientific grounds, of "island universes," external
to, and co-equal with, the Galaxy. Johann Heinrich Lambert,[16] a
tailor's apprentice from Mühlhausen, followed, but independently. The
conceptions of this remarkable man were grandiose, his intuitions bold,
his views on some points a singular anticipation of subsequent
discoveries. The sidereal world presented itself to him as a hierarchy
of systems, starting from the planetary scheme, rising to throngs of
suns within the circuit of the Milky Way--the "ecliptic of the stars,"
as he phrased it--expanding to include groups of many Milky Ways; these
again combining to form the unit of a higher order of assemblage, and so
onwards and upwards until the mind reels and sinks before the immensity
of the contemplated creations.

"Thus everything revolves--the earth round the sun; the sun round the
centre of his system; this system round a centre common to it with other
systems; this group, this assemblage of systems, round a centre which is
common to it with other groups of the same kind; and where shall we have
done?"[17]

The stupendous problem thus speculatively attempted, Herschel undertook
to grapple with experimentally. The upshot of this memorable inquiry was
the inclusion, for the first time, within the sphere of human knowledge,
of a connected body of facts, and inferences from facts, regarding the
sidereal universe; in other words, the foundation of what may properly
be called a science of the stars.

Tobias Mayer had illustrated the perspective effects which must ensue in
the stellar sphere from a translation of the solar system, by comparing
them to the separating in front and closing up behind of trees in a
forest to the eye of an advancing spectator;[18] but the appearances
which he thus correctly described he was unable to detect. By a more
searching analysis of a smaller collection of proper motions, Herschel
succeeded in rendering apparent the very consequences foreseen by Mayer.
He showed, for example, that Arcturus and Vega did, in fact, appear to
recede from, and Sirius and Aldebaran to approach, each other by very
minute amounts; and, with a striking effort of divinatory genius, placed
the "apex," or point of direction of the sun's motion, close to the star
Lambda in the constellation Hercules,[19] within a few degrees of
the spot indicated by later and indefinitely more refined methods of
research. He resumed the subject in 1805,[20] but though employing a
more rigorous method, was scarcely so happy in his result. In 1806,[21]
he made a preliminary attempt to ascertain the speed of the sun's
journey, fixing it, by doubtless much too low an estimate, at about
three miles a second. Yet the validity of his general conclusion as to
the line of solar travel, though long doubted, has been triumphantly
confirmed. The question as to the "secular parallax" of the fixed stars
was in effect answered.

With their _annual_ parallax, however, the case was very different. The
search for it had already led Bradley to the important discoveries of
the aberration of light and the nutation of the earth's axis; it was now
about to lead Herschel to a discovery of a different, but even more
elevated character. Yet in neither case was the object primarily sought
attained.

From the very first promulgation of the Copernican theory the seeming
immobility of the stars had been urged as an argument against its truth;
for if the earth really travelled in a vast orbit round the sun, objects
in surrounding space should appear to change their positions, unless
their distances were on a scale which, to the narrow ideas of the
universe then prevailing, seemed altogether extravagant.[22] The
existence of such apparent or "parallactic" displacements was
accordingly regarded as the touchstone of the new views, and their
detection became an object of earnest desire to those interested in
maintaining them. Copernicus himself made the attempt; but with his
"Triquetrum," a jointed wooden rule with the divisions marked in ink,
constructed by himself,[23] he was hardly able to measure angles of ten
minutes, far less fractions of a second. Galileo, a more impassioned
defender of the system, strained his ears, as it were, from Arcetri, in
his blind and sorrowful old age, for news of a discovery which two more
centuries had still to wait for. Hooke believed he had found a parallax
for the bright star in the Head of the Dragon; but was deceived. Bradley
convinced himself that such effects were too minute for his instruments
to measure. Herschel made a fresh attempt by a practically untried
method.

It is a matter of daily experience that two objects situated at
different distances seem to a beholder in motion to move relatively to
each other. This principle Galileo, in the third of his Dialogues on the
Systems of the World,[24] proposed to employ for the determination of
stellar parallax; for two stars, lying apparently close together, but in
reality separated by a great gulf of space, must shift their mutual
positions when observed from opposite points of the earth's orbit; or
rather, the remoter forms a virtually fixed point, to which the
movements of the other can be conveniently referred. By this means
complications were abolished more numerous and perplexing than Galileo
himself was aware of, and the problem was reduced to one of simple
micrometrical measurement. The "double-star method" was also suggested
by James Gregory in 1675, and again by Wallis in 1693;[25] Huygens
first, and afterwards Dr. Long of Cambridge (about 1750), made futile
experiments with it; and it eventually led, in the hands of Bessel, to
the successful determination of the parallax of 61 Cygni.
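The geometry behind this "double-star method" can be sketched in a few lines. The parsec unit and the figures below are modern conveniences, not Herschel's own; a minimal illustration, assuming the remoter star is effectively fixed:

```python
def differential_parallax(d_near_pc, d_far_pc):
    """Relative annual shift, in arcseconds, of a near star against a
    remote one lying almost on the same line of sight."""
    # Annual parallax in arcseconds is the reciprocal of the distance
    # in parsecs; the remoter star serves as a quasi-fixed reference,
    # so only the *difference* of the two parallaxes is observed.
    return 1.0 / d_near_pc - 1.0 / d_far_pc

# A star at 3.5 parsecs (roughly 61 Cygni) paired with a background
# star at 350 parsecs shifts by about 0.28 arcseconds relative to it.
print(round(differential_parallax(3.5, 350.0), 3))
```

The virtue of the method, as the text notes, is that the remote companion absorbs most sources of error common to both stars, reducing the problem to a differential measurement.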

Its advantages were not lost upon Herschel. His attempt to assign
definite distances to the nearest stars was no isolated effort, but part
of the settled plan upon which his observations were conducted. He
proposed to sound the heavens, and the first requisite was a knowledge
of the length of his sounding-line. Thus it came about that his special
attention was early directed to double stars.

"I resolved," he writes,[26] "to examine every star in the heavens with
the utmost attention and a very high power, that I might collect such
materials for this research as would enable me to fix my observations
upon those that would best answer my end. The subject has already proved
so extensive, and still promises so rich a harvest to those who are
inclined to be diligent in the pursuit, that I cannot help inviting
every lover of astronomy to join with me in observations that must
inevitably lead to new discoveries."

The first result of these inquiries was a classed catalogue of 269
double stars presented to the Royal Society in 1782, followed, after
three years, by an additional list of 434. In both these collections the
distances separating the individuals of each pair were carefully
measured, and (with a few exceptions) the angles made with the
hour-circle by the lines joining their centres (technically called
"angles of position") were determined with the aid of a "revolving-wire
micrometer," specially devised for the purpose. Moreover, an important
novelty was introduced by the observation of the various colours visible
in the star-couples, the singular and vivid contrasts of which were now
for the first time described.

Double stars were at that time supposed to be a purely optical
phenomenon. Their components, it was thought, while in reality
indefinitely remote from each other, were brought into fortuitous
contiguity by the chance of lying nearly in the same line of sight from
the earth. Yet Bradley had noticed a change of 30°, between 1718 and
1759, in the position-angle of the two stars forming Castor, and was
thus within a hair's breadth of the discovery of their physical
connection.[27] The Rev. John Michell, meanwhile, arguing by the
doctrine of probabilities, wrote as follows in 1767:--"It is highly probable in
particular, and next to a certainty in general, that such double stars
as appear to consist of two or more stars placed very near together, do
really consist of stars placed near together, and under the influence of
some general law."[28] And in 1784:[29] "It is not improbable that a few
years may inform us that some of the great number of double, triple
stars, etc., which have been observed by Mr. Herschel, are systems of
bodies revolving about each other."

This remarkable speculative anticipation had a practical counterpart in
Germany. Father Christian Mayer, a Jesuit astronomer at Mannheim, set
himself, in January 1776, to collect examples of stellar pairs, and
shortly after published the supposed discovery of "satellites" to many
of the principal stars.[30] But his observations were neither exact nor
prolonged enough to lead to useful results in such an inquiry. His
disclosures were derided; his planet-stars treated as results of
hallucination. _On n'a point cru à des choses aussi extraordinaires_
("no one believed in things so extraordinary"), wrote Lalande[31]
within one year of a better-grounded announcement to the same effect.

Herschel at first shared the general opinion as to the merely optical
connection of double stars. Of this the purpose for which he made his
collection is in itself sufficient evidence, since what may be called
the _differential_ method of parallaxes depends, as we have seen, for
its efficacy upon disparity of distance. It was "much too soon," he
declared in 1782,[32] "to form any theories of small stars revolving
round large ones;" while in the year following,[33] he remarked that the
identical proper motions of the two stars forming, to the naked eye, the
single bright orb of Castor could only be explained as both equally due
to the "systematic parallax" caused by the sun's movement in space.
Plainly showing that the notion of a physical tie, compelling the two
bodies to travel together, had not as yet entered into his speculations.
But he was eminently open to conviction, and had, moreover, by
observations unparalleled in amount as well as in kind, prepared ample
materials for convincing himself and others. In 1802 he was able to
announce the fact of his discovery, and in the two ensuing years, to lay
in detail before the Royal Society proofs, gathered from the labours of
a quarter of a century, of orbital revolution in the case of as many as
fifty double stars, henceforth, he declared, to be held as real binary
combinations, "intimately held together by the bond of mutual
attraction."[34] The fortunate preservation in Dr. Maskelyne's note-book
of a remark made by Bradley about 1759, to the effect that the line
joining the components of Castor was an exact prolongation of that
joining Castor with Pollux, added eighteen years to the time during
which the pair were under scrutiny, and confirmed the evidence of change
afforded by more recent observations. Approximate periods were fixed for
many of the revolving suns--for Castor 342 years; for Gamma Leonis,
1200, Delta Serpentis, 375, Eta Bootis, 1681 years; Epsilon Lyræ
was noted as a "double-double-star," a change of relative
situation having been detected in each of the two pairs composing the
group; and the occultation was described of one star by another in the
course of their mutual revolutions, as exemplified in 1795 by the
rapidly circulating system of Zeta Herculis.

Thus, by the sagacity and perseverance of a single observer, a firm
basis was at last provided upon which to raise the edifice of sidereal
science. The analogy long presumed to exist between the mighty star of
our system and the bright points of light spangling the firmament was
shown to be no fiction of the imagination, but a physical reality; the
fundamental quality of attractive power was proved to be common to
matter so far as the telescope was capable of exploring, and law,
subordination, and regularity to give testimony of supreme and
intelligent design no less in those limitless regions of space than in
our narrow terrestrial home. The discovery was emphatically (in Arago's
phrase) "one with a future," since it introduced the element of precise
knowledge where more or less probable conjecture had previously held
almost undivided sway; and precise knowledge tends to propagate itself
and advance from point to point.

We have now to speak of Herschel's pioneering work in the skies. To
explore with line and plummet the shining zone of the Milky Way, to
delineate its form, measure its dimensions, and search out the
intricacies of its construction, was the primary task of his life, which
he never lost sight of, and to which all his other investigations were
subordinate. He was absolutely alone in this bold endeavour. Unaided, he
had to devise methods, accumulate materials, and sift out results. Yet
it may safely be asserted that all the knowledge we possess on this
sublime subject was prepared, and the greater part of it anticipated, by
him.

The ingenious method of "star-gauging," and its issue in the delineation
of the sidereal system as an irregular stratum of evenly-scattered suns,
is the best-known part of his work. But it was, in truth, only a first
rude approximation, the principle of which maintained its credit in the
literature of astronomy a full half-century after its abandonment by its
author. This principle was the general equality of star distribution. If
equal portions of space really held equal numbers of stars, it is
obvious that the number of stars visible in any particular direction
would be strictly proportional to the range of the system in that
direction, apparent accumulation being produced by real extent. The
process of "gauging the heavens," accordingly, consisted in counting the
stars in successive telescopic fields, and calculating thence the depths
of space necessary to contain them. The result of 3,400 such operations
was the plan of the Galaxy familiar to every reader of an astronomical
text-book. Widely-varying evidence was, as might have been expected,
derived from an examination of different portions of the sky. Some
fields of view were almost blank, while others (in or near the Milky
Way) blazed with the radiance of many hundred stars compressed into an
area about one-fourth that of the full-moon. In the most crowded parts
116,000 were stated to have been passed in review within a quarter of an
hour. Here the "length of his sounding-line" was estimated by Herschel
at about 497 times the distance of Sirius--in other words, the bounding
orb, or farthest sun of the system in that direction, so far as could be
seen with the 20-foot reflector, was thus inconceivably remote. But
since the distance of Sirius, no less than of every other fixed star,
was as yet an unknown quantity, the dimensions inferred for the Galaxy
were of course purely relative; a knowledge of its form and structure
might (admitting the truth of the fundamental hypothesis) be obtained,
but its real or absolute size remained altogether undetermined.
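The arithmetic of star-gauging follows directly from the uniformity assumption: a telescopic field of fixed angular size sees a cone of space, whose volume (and hence star count) grows as the cube of its depth. A minimal sketch of the principle, with illustrative counts that are not Herschel's:

```python
def relative_range(star_count, reference_count):
    """Depth of the star system in one direction, relative to a
    direction whose gauge-field holds reference_count stars."""
    # Under an even scattering of stars, the number seen in a field is
    # proportional to the volume of the cone it commands, i.e. to the
    # cube of the sounding depth; so the relative depth is the cube
    # root of the relative count.
    return (star_count / reference_count) ** (1.0 / 3.0)

# A field eight times richer than another reaches only twice as deep:
# apparent accumulation is read off as real extent.
print(relative_range(8, 1))
```

The cube-root relation also shows why the method was only ever relative: without an absolute distance for a single star, every depth so computed remained a pure ratio.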

Even as early as 1785, however, Herschel perceived traces of a tendency
which completely invalidated the supposition of any approach to an
average uniformity of distribution. This was the action of what he
called a "clustering power" in the Milky Way. "Many gathering
clusters"[35] were already discernible to him even while he endeavoured
to obtain a "true _mean_ result" on the assumption that each star in
space was separated from its neighbours as widely as the sun from
Sirius. "It appears," he wrote in 1789, "that the heavens consist of
regions where suns are gathered into separate systems"; and in certain
assemblages he was able to trace "a course or tide of stars setting
towards a centre," denoting, not doubtfully, the presence of attractive
forces.[36] Thirteen years later, he described our sun and his
constellated companions as surrounded by "a magnificent collection of
innumerable stars, called the Milky Way, which must occasion a very
powerful balance of opposite attractions to hold the intermediate stars
at rest. For though our sun, and all the stars we see, may truly be said
to be in the plane of the Milky Way, yet I am now convinced, by a long
inspection and continued examination of it, that the Milky Way itself
consists of stars very differently scattered from those which are
immediately about us." "This immense aggregation," he added, "is by no
means uniform. Its component stars show evident signs of clustering
together into many separate allotments."[37]

The following sentences, written in 1811, contain a definite
retractation of the view frequently attributed to him:--

"I must freely confess," he says, "that by continuing my sweeps of the
heavens my opinion of the arrangement of the stars and their magnitudes,
and of some other particulars, has undergone a gradual change; and
indeed, when the novelty of the subject is considered, we cannot be
surprised that many things formerly taken for granted should on
examination prove to be different from what they were generally but
incautiously supposed to be. For instance, an equal scattering of the
stars may be admitted in certain calculations; but when we examine the
Milky Way, or the closely compressed clusters of stars of which my
catalogues have recorded so many instances, this supposed equality of
scattering must be given up."[38]

Another assumption, the means of detecting the fallacy of which have
only since become available, was retained by him to the end of his
life. It was that the brightness of a star afforded an approximate
measure of its distance. Upon this principle he founded in 1817 his
method of "limiting apertures,"[39] by which two stars, brought into
view in two precisely similar telescopes, were "equalised" by covering a
certain portion of the object-glass collecting the more brilliant rays.
The distances of the orbs compared were then taken to be in the ratio of
the reduced to the original apertures of the instruments with which they
were examined. If indeed the absolute lustre of each were the same, the
result might be accepted with confidence; but since we have no warrant
for assuming a "standard star" to facilitate our computations, but much
reason to suppose an indefinite range, not only of size but of intrinsic
brilliancy, in the suns of our firmament, conclusions drawn from such a
comparison are entirely worthless.
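The inference from the equalised apertures can be put in a line of algebra. Light gathered scales with the square of the aperture's diameter, while apparent brightness falls off with the square of the distance; equalising the two images therefore makes the distances stand in the ratio of the reduced to the full aperture. A sketch of that inference (flawed, as the text explains, by its assumption of equal intrinsic lustre), with illustrative numbers:

```python
def inferred_distance_ratio(full_aperture, reduced_aperture):
    """Ratio of the nearer (brighter) star's distance to the farther
    star's, as the method of limiting apertures would infer it, on the
    unwarranted assumption of equal absolute brightness."""
    # Seen brightness ~ diameter**2 / distance**2; equalising the
    # dimmed bright star with the faint one gives
    #   (reduced / d_near)**2 == (full / d_far)**2,
    # whence d_near / d_far = reduced / full.
    return reduced_aperture / full_aperture

# Stopping the aperture down to a quarter of its width to reach
# equality would place the brighter star at a quarter the distance.
print(inferred_distance_ratio(4.0, 1.0))
```

The formula makes the weakness plain: the result depends entirely on the equal-lustre assumption, and an intrinsically brilliant but remote star wrecks the comparison.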

In another branch of sidereal science besides that of stellar
aggregation, Herschel may justly be styled a pioneer. He was the first
to bestow serious study on the enigmatical objects known as "nebulæ."
The history of the acquaintance of our race with them is comparatively
short. The only one recognised before the invention of the telescope was
that in the girdle of Andromeda, certainly familiar in the middle of the
tenth century to the Persian astronomer Abdurrahman Al-Sûfi; and marked
with dots on Spanish and Dutch constellation-charts of the fourteenth
and fifteenth centuries.[40] Yet so little was it noticed that it might
practically be said--as far as Europe is concerned--to have been
discovered in 1612 by Simon Marius (Mayer of Genzenhausen), who aptly
described its appearance as that of a "candle shining through horn." The
first mention of the great Orion nebula is by a Swiss Jesuit named
Cysatus, who succeeded Father Scheiner in the chair of mathematics at
Ingolstadt. He used it, apparently without any suspicion of its novelty,
as a term of comparison for the comet of December 1618.[41] A novelty,
nevertheless, to astronomers it still remained in 1656, when Huygens
discerned, "as it were, an hiatus in the sky, affording a glimpse of a
more luminous region beyond."[42] Halley in 1716 knew of six nebulæ,
which he believed to be composed of a "lucid medium" diffused through
the ether of space.[43] He appears, however, to have been unacquainted
with some previously noticed by Hevelius. Lacaille brought back with him
from the Cape a list of forty-two--the first-fruits of observation in
Southern skies--arranged in three numerically equal classes;[44] and
Messier (nicknamed by Louis XV. the "ferret of comets"), finding such
objects a source of extreme perplexity in the pursuit of his chosen
game, attempted to eliminate by methodising them, and drew up a
catalogue comprising, in 1781, 103 entries.[45]

These preliminary attempts shrank into insignificance when Herschel
began to "sweep the heavens" with his giant telescopes. In 1786 he
presented to the Royal Society a descriptive catalogue of 1,000 nebulæ
and clusters, followed, three years later, by a second of as many more;
to which he added in 1802 a further gleaning of 500. On the subject of
their nature his views underwent a remarkable change. Finding that his
potent instruments resolved into stars many nebulous patches in which no
signs of such a structure had previously been discernible, he naturally
concluded that "resolvability" was merely a question of distance and
telescopic power. He was (as he said himself) led on by almost
imperceptible degrees from evident clusters, such as the Pleiades, to
spots without a trace of stellar formation, the gradations being so well
connected as to leave no doubt that all these phenomena were equally
stellar. The singular variety of their appearance was thus described by
him:--

"I have seen," he says, "double and treble nebulæ variously arranged;
large ones with small, seeming attendants; narrow, but much extended
lucid nebulæ or bright dashes; some of the shape of a fan, resembling an
electric brush, issuing from a lucid point; others of the cometic shape,
with a seeming nucleus in the centre, or like cloudy stars surrounded
with a nebulous atmosphere; a different sort, again, contain a
nebulosity of the milky kind, like that wonderful, inexplicable
phenomenon about Theta Orionis; while others shine with a fainter,
mottled kind of light, which denotes their being resolvable into
stars."[46]

"These curious objects" he considered to be "no less than whole sidereal
systems,"[47] some of which might "well outvie our Milky Way in
grandeur." He admitted, however, a wide diversity in condition as well
as compass. The system to which our sun belongs he described as "a very
extensive branching congeries of many millions of stars, which probably
owes its origin to many remarkably large as well as pretty closely
scattered small stars, that may have drawn together the rest."[48] But
the continued action of this same "clustering power" would, he supposed,
eventually lead to the breaking-up of the original majestic Galaxy into
two or three hundred separate groups, already visibly gathering. Such
minor nebulæ, due to the "decay" of other "branching nebulæ" similar to
our own, he recognised by the score, lying, as it were, stratified in
certain quarters of the sky. "One of these nebulous beds," he informs
us, "is so rich that in passing through a section of it, in the time of
only thirty-six minutes, I detected no less than thirty-one nebulæ, all
distinctly visible upon a fine blue sky." The stratum of Coma Berenices
he judged to be the nearest to our system of such layers; nor did the
marked aggregation of nebulæ towards both poles of the circle of the
Milky Way escape his notice.

By a continuation of the same process of reasoning, he was enabled (as
he thought) to trace the life-history of nebulæ from a primitive loose
and extended formation, through clusters of gradually increasing
compression, down to the kind named by him "Planetary" because of the
defined and uniform discs which they present. These he regarded as "very
aged, and drawing on towards a period of change or dissolution."[49]

"This method of viewing the heavens," he concluded, "seems to throw them
into a new kind of light. They now are seen to resemble a luxuriant
garden which contains the greatest variety of productions in different
flourishing beds; and one advantage we may at least reap from it is,
that we can, as it were, extend the range of our experience to an
immense duration. For, to continue the simile which I have borrowed from
the vegetable kingdom, is it not almost the same thing whether we live
successively to witness the germination, blooming, foliage, fecundity,
fading, withering, and corruption of a plant, or whether a vast number
of specimens, selected from every stage through which the plant passes
in the course of its existence, be brought at once to our view?"[50]

But already this supposed continuity was broken. After mature
deliberation on the phenomena presented by nebulous stars, Herschel was
induced, in 1791, to modify essentially his original opinion.

"When I pursued these researches," he says, "I was in the situation of a
natural philosopher who follows the various species of animals and
insects from the height of their perfection down to the lowest ebb of
life; when, arriving at the vegetable kingdom, he can scarcely point out
to us the precise boundary where the animal ceases and the plant begins;
and may even go so far as to suspect them not to be essentially
different. But, recollecting himself, he compares, for instance, one of
the human species to a tree, and all doubt upon the subject vanishes
before him. In the same manner we pass through gentle steps from a
coarse cluster of stars, such as the Pleiades ... till we find ourselves
brought to an object such as the nebula in Orion, where we are still
inclined to remain in the once adopted idea of stars exceedingly remote
and inconceivably crowded, as being the occasion of that remarkable
appearance. It seems, therefore, to require a more dissimilar object to
set us right again. A glance like that of the naturalist, who casts his
eye from the perfect animal to the perfect vegetable, is wanting to
remove the veil from the mind of the astronomer. The object I have
mentioned above is the phenomenon that was wanting for this purpose.
View, for instance, the 19th cluster of my 6th class, and afterwards
cast your eye on this cloudy star, and the result will be no less
decisive than that of the naturalist we have alluded to. Our judgment, I
may venture to say, will be, that _the nebulosity about the star is not
of a starry nature_."[51]

The conviction thus arrived at of the existence in space of a widely
diffused "shining fluid" (a conviction long afterwards fully justified
by the spectroscope) led him into a field of endless speculation. What
was its nature? Should it "be compared to the coruscation of the
electric fluid in the aurora borealis? or to the more magnificent cone
of the zodiacal light?" Above all, what was its function in the cosmos?
And on this point he already gave a hint of the direction in which his
mind was moving by the remark that this self-luminous matter seemed
"more fit to produce a star by its condensation, than to depend on the
star for its existence."[52]

This was not a novel idea. Tycho Brahe had tried to explain the blaze of
the star of 1572 as due to a sudden concentration of nebulous material
in the Milky Way, even pointing out the space left dark and void by the
withdrawal of the luminous stuff; and Kepler, theorising on a similar
stellar apparition in 1604, followed nearly in the same track. But under
Herschel's treatment the nebular origin of stars first acquired the
consistency of a formal theory. He meditated upon it long and earnestly,
and in two elaborate treatises, published respectively in 1811 and 1814,
he at length set forth the arguments in its favour. These rested
entirely upon the "principle of continuity." Between the successive
classes of his assortment of developing objects there was, as he said,
"perhaps not so much difference as would be in an annual description of
the human figure, were it given from the birth of a child till he comes
to be a man in his prime."[53] From diffused nebulosity, barely visible
in the most powerful light-gathering instruments, but which he estimated
to cover nearly 152 square degrees of the heavens,[54] to planetary
nebulæ, supposed to be already centrally solid, instances were alleged
of every stage and phase of condensation. The validity of his reasoning,
however, was evidently impaired by his confessed inability to
distinguish between the dim rays of remote clusters and the milky light
of true gaseous nebulæ.

It may be said that such speculations are futile in themselves, and
necessarily barren of results. But they gratify an inherent tendency of
the human mind, and, if pursued in a becoming spirit, should be neither
reproved nor disdained. Herschel's theory still holds the field, the
testimony of recent discoveries with regard to it having proved strongly
confirmatory of its principle, although not of its details. Strangely
enough, it seems to have been propounded in complete independence of
Laplace's nebular hypothesis as to the origin of the solar system.
Indeed, it dated, as we have seen, in its first inception, from 1791,
while the French geometrician's view was not advanced until 1796.

We may now briefly sum up the chief results of Herschel's long years of
"watching the heavens." The apparent motions of the stars had been
disentangled; one portion being clearly shown to be due to a translation
towards a point in the constellation Hercules of the sun and his
attendant planets; while a large balance of displacement was left to be
accounted for by real movements, various in extent and direction, of the
stars themselves. By the action of a central force similar to, if not
identical with, gravity, suns of every degree of size and splendour, and
sometimes brilliantly contrasted in colour, were seen to be held
together in systems, consisting of two, three, four, even six members,
whose revolutions exhibited a wide range of variety both in period and
in orbital form. A new department of physical astronomy was thus
created,[55] and rigid calculation for the first time made possible
within the astral region. The vast problem of the arrangement and
relations of the millions of stars forming the Milky Way was shown to be
capable of experimental treatment, and of at least partial solution,
notwithstanding the variety and complexity seen to prevail, to an extent
previously undreamt of, in the arrangement of that majestic system. The
existence of a luminous fluid, diffused through enormous tracts of
space, and intimately associated with stellar bodies, was virtually
demonstrated, and its place and use in creation attempted to be divined
by a bold but plausible conjecture. Change on a stupendous scale was
inferred or observed to be everywhere in progress. Periodical stars
shone out and again decayed; progressive ebbings or flowings of light
were indicated as probable in many stars under no formal suspicion of
variability; forces were everywhere perceived to be at work, by which
the very structure of the heavens themselves must be slowly but
fundamentally modified. In all directions groups were seen to be formed
or forming; tides and streams of suns to be setting towards powerful
centres of attraction; new systems to be in process of formation, while
effete ones hastened to decay or regeneration when the course appointed
for them by Infinite Wisdom was run. And thus, to quote the words of the
observer who "had looked farther into space than ever human being did
before him,"[56] the state into which the incessant action of the
clustering power has brought the Milky Way at present, is a kind of
chronometer that may be used to measure the time of its past and future
existence; and although we do not know the rate of going of this
mysterious chronometer, it is nevertheless certain that, since the
breaking-up of the parts of the Milky Way affords a proof that it cannot
last for ever, it equally bears witness that its past duration cannot be
admitted to be infinite.[57]


FOOTNOTES:

[Footnote 3: _Phil. Trans._, vol. xxx., p. 737.]

[Footnote 4: Out of eighty stars compared, fifty-seven were found to
have changed their places by more than 10". Lesser discrepancies were at
that time regarded as falling within the limits of observational error.
_Tobiæ Mayeri Op. Inedita_, t. i., pp. 80, 81, and Herschel in _Phil.
Trans._, vol. lxxiii., pp. 275-278.]

[Footnote 5: _Posthumous Works_, p. 701.]

[Footnote 6: Arago in _Annuaire du Bureau des Longitudes_, 1842, p.
313.]

[Footnote 7: Bradley to Halley, _Phil. Trans._, vol. xxxv. (1728), p.
660. His observations were directly applicable to only two stars,
Gamma Draconis and Eta Ursæ Majoris, but some lesser ones
were included in the same result.]

[Footnote 8: Holden, _Sir William Herschel, his Life and Works_, p. 17.]

[Footnote 9: _Phil. Trans._, vol. ci., p. 269.]

[Footnote 10: Caroline Lucretia Herschel, born at Hanover, March 16,
1750, died in the same place, January 9, 1848. She came to England in
1772, and was her brother's devoted assistant, first in his musical
undertakings, and afterwards, down to the end of his life, in his
astronomical labours.]

[Footnote 11: Holden, _op. cit._, p. 39.]

[Footnote 12: _Memoir of Caroline Herschel_, p. 37.]

[Footnote 13: See Holden's _Sir William Herschel_, p. 54.]

[Footnote 14: _An Original Theory or New Hypothesis of the Universe_,
London, 1750. See also De Morgan's summary of his views in
_Philosophical Magazine_, April, 1848.]

[Footnote 15: _Allgemeine Naturgeschichte und Theorie des Himmels_,
1755.]

[Footnote 16: _Cosmologische Briefe_, Augsburg, 1761.]

[Footnote 17: _The System of the World_, p. 125, London, 1800 (a
translation of _Cosmologische Briefe_). Lambert regarded nebulæ as
composed of stars crowded together, but _not_ as external universes. In
the case of the Orion nebula, indeed, he throws out such a conjecture,
but afterwards suggests that it may form a centre for that one of the
subordinate systems composing the Milky Way to which our sun belongs.]

[Footnote 18: _Opera Inedita_, t. i., p. 79.]

[Footnote 19: _Phil. Trans._, vol. lxxiii. (1783), p. 273. Pierre
Prévost's similar investigation, communicated to the Berlin Academy of
Sciences four months later, July 3, 1783, was inserted in the _Memoirs_
of that body for 1781, and thus _seems_ to claim a priority not its due.
Georg Simon Klügel at Halle gave about the same time an analytical
demonstration of Herschel's result. Wolf, _Gesch. der Astronomie_, p.
733.]

[Footnote 20: _Phil. Trans._, vol. xcv., p. 233.]

[Footnote 21: _Ibid._, vol. xcvi., p. 205.]

[Footnote 22: "Ingens bolus devorandus est," Kepler admitted to Herwart
in May, 1603.]

[Footnote 23: Described in "Præfatio Editoris" to _De Revolutionibus_,
p. xix. (ed. 1854).]

[Footnote 24: _Opere_, t. i., p. 415.]

[Footnote 25: _Phil. Trans._, vol. xvii., p. 848.]

[Footnote 26: _Ibid._, vol. lxxii., p. 97.]

[Footnote 27: Doberck, _Observatory_, vol. ii., p. 110.]

[Footnote 28: _Phil. Trans._, vol. lvii., p. 249.]

[Footnote 29: _Ibid._, vol. lxxiv., p. 56.]

[Footnote 30: _Beobachtungen von Fixsterntrabanten_, 1778; and _De Novis
in Coelo Sidereo Phænomenis_, 1779.]

[Footnote 31: _Bibliographie_, p. 569.]

[Footnote 32: _Phil. Trans._, vol. lxxii., p. 162.]

[Footnote 33: _Ibid._, vol. lxxiii., p. 272.]

[Footnote 34: _Ibid._, vol. xciii., p. 340.]

[Footnote 35: _Phil. Trans._, vol. lxxv., p. 255.]

[Footnote 36: _Ibid._, vol. lxxix., pp. 214, 222.]

[Footnote 37: _Ibid._, vol. xcii., pp. 479, 495.]

[Footnote 38: _Phil. Trans._, vol. ci., p. 269.]

[Footnote 39: _Ibid._, vol. cvii., p. 311.]

[Footnote 40: Bullialdus, _De Nebulosâ Stellâ in Cingulo Andromedæ_
(1667); see also G. P. Bond, _Mém. Am. Ac._, vol. iii., p. 75, Holden's
Monograph on the Orion Nebula, _Washington Observations_, vol. xxv.,
1878 (pub. 1882), and Lady Huggins's drawing, _Atlas of Spectra_, p.
119.]

[Footnote 41: _Mathemata Astronomica_, p. 75.]

[Footnote 42: _Systema Saturnium_, p. 9.]

[Footnote 43: _Phil. Trans._, vol. xxix., p. 390.]

[Footnote 44: _Mém. Ac. des Sciences_, 1755.]

[Footnote 45: _Conn. des Temps_, 1784 (pub. 1781), p. 227. A previous
list of forty-five had appeared in _Mém. Ac. des Sciences_, 1771.]

[Footnote 46: _Phil. Trans._, vol. lxxiv., p. 442.]

[Footnote 47: _Ibid._, vol. lxxix., p. 213.]

[Footnote 48: _Ibid._, vol. lxxv., p. 254.]

[Footnote 49: _Ibid._, vol. lxxix., p. 225.]

[Footnote 50: _Phil. Trans._, vol. lxxix., p. 226.]

[Footnote 51: _Ibid._, vol. lxxxi., p. 72.]

[Footnote 52: _Ibid._, p. 85.]

[Footnote 53: _Phil. Trans._, vol. ci., p. 271.]

[Footnote 54: _Ibid._, p. 277.]

[Footnote 55: J. Herschel, _Phil. Trans._, vol. cxvi., part iii., p. 1.]

[Footnote 56: His own words to the poet Campbell cited by Holden, _Life
and Works_, p. 109.]

[Footnote 57: _Phil. Trans._, vol. civ., p. 283.]


A Popular History of Astronomy During the
Nineteenth Century, by Agnes M. (Agnes Mary) Clerke