Monday, 5 January 2015

CPU, RAM AND MOTHERBOARD TROUBLESHOOTING

Random-access memory (RAM /ræm/) is a form of computer data storage. A random-access memory device allows data items to be read and written in roughly the same amount of time regardless of the order in which data items are accessed.[1] In contrast, with other direct-access data storage media such as hard disks, CD-RWs, DVD-RWs and the older drum memory, the time required to read and write data items varies significantly depending on their physical locations on the recording medium, due to mechanical limitations such as media rotation speeds and arm movement delays.
Today, random-access memory takes the form of integrated circuits. RAM is normally associated with volatile types of memory (such as DRAM memory modules), where stored information is lost if power is removed, although many efforts have been made to develop non-volatile RAM chips.[2] Other types of non-volatile memory exist that allow random access for read operations, but either do not allow write operations or have limitations on them. These include most types of ROM and a type of flash memory called NOR-Flash.
Integrated-circuit RAM chips came into the market in the late 1960s, with the first commercially available DRAM chip, the Intel 1103, introduced in October 1970.[3]


History


[Images: IBM tabulating machines from the 1930s that used mechanical counters to store information; a portion of a core-memory plane with a modern flash RAM SD card on top; a 1-megabit chip, one of the last models developed by VEB Carl Zeiss Jena in 1989]
Early computers used relays, mechanical counters[4] or delay lines for main memory functions. Ultrasonic delay lines could only reproduce data in the order it was written. Drum memory could be expanded at relatively low cost but efficient retrieval of memory items required knowledge of the physical layout of the drum to optimize speed. Latches built out of vacuum tube triodes, and later, out of discrete transistors, were used for smaller and faster memories such as registers. Such registers were relatively large and too costly to use for large amounts of data; generally only a few dozen or few hundred bits of such memory could be provided.
The first practical form of random-access memory was the Williams tube starting in 1947. It stored data as electrically charged spots on the face of a cathode ray tube. Since the electron beam of the CRT could read and write the spots on the tube in any order, memory was random access. The capacity of the Williams tube was a few hundred to around a thousand bits, but it was much smaller, faster, and more power-efficient than using individual vacuum tube latches. Developed at the University of Manchester in England, the Williams tube provided the medium on which the first electronically stored-memory program was implemented in the Manchester Small-Scale Experimental Machine (SSEM) computer, which first successfully ran a program on 21 June 1948.[5] In fact, rather than the Williams tube memory being designed for the SSEM, the SSEM was a testbed to demonstrate the reliability of the memory.[6][7]
When shopping for motherboards for a PC, two of the most important specifications are the CPU and the RAM. Both affect how fast your computer runs, but in very different ways, because they have very different functions. To get the most performance out of your computer, it helps to understand the roles the CPU and RAM play in your PC's overall speed.
What is a CPU?
CPU stands for Central Processing Unit. The CPU is the brain of your computer: it performs the calculations required to make programs run. The speed of a CPU is measured in hertz (scaling up to megahertz and gigahertz), that is, how many cycles it can process per second. When your computer runs a program, it is essentially following a set of instructions one at a time. CPUs can have multiple cores (dual-core, quad-core, and so on), which lets them work on several instruction streams at once. In a way, it's a bit like the checkout line at the grocery store: adding a core is like opening an additional register, so more customers can be served at the same time, even though no single customer checks out any faster.
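The checkout-line analogy can be sketched in code. Here is a minimal Python example (the busy_work function is just a made-up CPU-bound task, not from any real program) that spreads four independent jobs across worker processes, one per core:

```python
import math
from concurrent.futures import ProcessPoolExecutor

def busy_work(n):
    # A made-up CPU-bound task: one "customer's cart" at the checkout.
    return sum(math.sqrt(i) for i in range(n))

if __name__ == "__main__":
    carts = [2_000_000] * 4  # four equally sized jobs

    # A single process is one open register; the pool opens extra
    # registers, one worker per core, so carts are rung up in parallel.
    with ProcessPoolExecutor(max_workers=4) as pool:
        results = list(pool.map(busy_work, carts))
    print(len(results))  # 4
```

Note that, just as in the store, the total throughput improves only if the jobs are independent; one very long checkout still takes as long as it takes.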
What is RAM?
RAM stands for Random Access Memory. This is different from the storage used to keep files permanently on your hard drive; RAM is better thought of as short-term memory. The data stored in your RAM is the information needed to keep a program running or to complete a process. Think of it a bit like the scratch space for a math problem, where you work out the smaller calculations that lead to the solution of the whole problem.
To this end, RAM is incredibly important to multitasking (i.e. running multiple programs at once). RAM allows a program to continue running in the background so you can return to it without re-launching it.
How RAM and the CPU work together
As a general rule of thumb, the more RAM you have, the faster your computer will be when running multiple applications. That’s because it’s much faster for the CPU to read data from the RAM than from the hard drive. For example, if the CPU were a carpenter, the RAM would be his tool belt. The bigger his tool belt, the less time he has to waste going all the way back to his truck or tool chest.
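The tool-belt idea can be roughly illustrated in Python. This is a sketch, not a proper benchmark (in particular, the operating system's file cache will keep even the "disk" reads fairly fast, so the real gap between RAM and a cold disk read is much larger than this shows):

```python
import os
import tempfile
import timeit

# Write some data to disk (this stands in for the carpenter's truck).
path = os.path.join(tempfile.gettempdir(), "tools.txt")
with open(path, "w") as f:
    f.write("hammer\n" * 100_000)

# The tool belt: load the file into RAM once, then reuse it.
with open(path) as f:
    in_ram = f.read()

def count_from_ram():
    return in_ram.count("hammer")

def count_from_disk():
    with open(path) as f:  # walk back to the truck every time
        return f.read().count("hammer")

ram_time = timeit.timeit(count_from_ram, number=50)
disk_time = timeit.timeit(count_from_disk, number=50)
print(f"RAM: {ram_time:.4f}s  disk: {disk_time:.4f}s")
```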
So, what will increase the speed of your computer the most? It depends on where the bottleneck is. You can check by watching CPU and memory usage in your operating system's task manager while you work.

Figure 1: The Central Processing Unit
The computer does its primary work in a part of the machine we cannot see, a control center that converts data input to information output. This control center, called the central processing unit (CPU), is a highly complex, extensive set of electronic circuitry that executes stored program instructions. All computers, large and small, must have a central processing unit. As Figure 1 shows, the central processing unit consists of two parts: The control unit and the arithmetic/logic unit. Each part has a specific function.
Before we discuss the control unit and the arithmetic/logic unit in detail, we need to consider data storage and its relationship to the central processing unit. Computers use two types of storage: Primary storage and secondary storage. The CPU interacts closely with primary storage, or main memory, referring to it for both instructions and data. For this reason this part of the reading will discuss memory in the context of the central processing unit. Technically, however, memory is not part of the CPU.

Recall that a computer's memory holds data only temporarily, at the time the computer is executing a program. Secondary storage holds permanent or semi-permanent data on some external magnetic or optical medium. The diskettes and CD-ROM disks that you have seen with personal computers are secondary storage devices, as are hard disks. Since the physical attributes of secondary storage devices determine the way data is organized on them, we will discuss secondary storage and data organization together in another part of our on-line readings.

Now let us consider the components of the central processing unit.


  • The Control Unit
    The control unit of the CPU contains circuitry that uses electrical signals to direct the entire computer system to carry out, or execute, stored program instructions. Like an orchestra leader, the control unit does not execute program instructions; rather, it directs other parts of the system to do so. The control unit must communicate with both the arithmetic/logic unit and memory.

  • The Arithmetic/Logic Unit
    The arithmetic/logic unit (ALU) contains the electronic circuitry that executes all arithmetic and logical operations.

    The arithmetic/logic unit can perform four kinds of arithmetic operations, or mathematical calculations: addition, subtraction, multiplication, and division. As its name implies, the arithmetic/logic unit also performs logical operations. A logical operation is usually a comparison. The unit can compare numbers, letters, or special characters. The computer can then take action based on the result of the comparison. This is a very important capability. It is by comparing that a computer is able to tell, for instance, whether there are unfilled seats on airplanes, whether charge-card customers have exceeded their credit limits, and whether one candidate for Congress has more votes than another.

    Logical operations can test for three conditions:
    • Equal-to condition. In a test for this condition, the arithmetic/logic unit compares two values to determine if they are equal. For example: If the number of tickets sold equals the number of seats in the auditorium, then the concert is declared sold out.
    • Less-than condition. To test for this condition, the computer compares values to determine if one is less than another. For example: If the number of speeding tickets on a driver's record is less than three, then insurance rates are $425; otherwise, the rates are $500.
    • Greater-than condition. In this type of comparison, the computer determines if one value is greater than another. For example: If the hours a person worked this week are greater than 40, then multiply every extra hour by 1.5 times the usual hourly wage to compute overtime pay.

    A computer can simultaneously test for more than one condition. In fact, a logic unit can usually discern six logical relationships: equal to, less than, greater than, less than or equal to, greater than or equal to, and not equal.

    The symbols that let you define the type of comparison you want the computer to perform are called relational operators. The most common relational operators are the equal sign (=), the less-than symbol (<), and the greater-than symbol (>).
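    The three conditions above map directly onto relational operators in any programming language. Here is a small Python sketch of the reading's own examples (the function names and the specific numbers just follow the examples, they are not from any real system):

```python
def concert_status(tickets_sold, seats):
    # Equal-to condition
    return "sold out" if tickets_sold == seats else "tickets available"

def insurance_rate(speeding_tickets):
    # Less-than condition
    return 425 if speeding_tickets < 3 else 500

def overtime_pay(hours_worked, hourly_wage):
    # Greater-than condition: every hour past 40 is paid at 1.5x
    if hours_worked > 40:
        return (hours_worked - 40) * 1.5 * hourly_wage
    return 0.0

print(concert_status(500, 500))  # sold out
print(insurance_rate(2))         # 425
print(overtime_pay(45, 20.0))    # 150.0
```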

    • How CPU Usage Relates to RAM Usage
      The CPU performs instructions on data held in memory (adding numbers, for example). RAM is just one level of the memory pyramid (see below). When you are processing lots of data, that data (or large portions of it) will likely get loaded into RAM so it is ready for the CPU; this speeds things up because RAM is much faster to access than storage devices. So CPU usage and RAM usage often correlate, but they don't have to.
      A basic example might be an image-editing program. I load up my 20 MB JPEG, the program reads the entire image, and the OS keeps it in RAM for me (all working memory looks the same to the program; the OS decides whether it lives in RAM or in the page/swap file on disk). So the image sits in RAM waiting to be processed, but I go for coffee before telling the program to apply some silly filter, so the CPU isn't doing anything. I come back, apply the filter to add some bubbles to the image, and the CPU jumps to 100%; even more memory gets used, because the program keeps the unfiltered image around so I can undo the change I just made. When the program is done adding the bubbles, the CPU usage drops, but the memory usage may not.
      Of course, it isn't quite this simple :-)
      [Image: the memory pyramid]
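      The point that CPU usage and memory usage can move independently is easy to demonstrate. A small Python sketch (the sizes are arbitrary, chosen only to make the contrast visible):

```python
import sys

# Memory-heavy, CPU-light: allocate a big list, then let it sit idle
# (like the unfiltered image kept around for undo while you get coffee).
undo_buffer = [0] * 1_000_000
print(sys.getsizeof(undo_buffer))  # several megabytes, CPU doing nothing

# CPU-heavy, memory-light: a tight loop that keeps one running total
# (like applying the filter, but without holding on to a big result).
total = 0
for i in range(1_000_000):
    total += i
print(total)  # 499999500000
```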

      It's the same relationship as your brain has with a book: a faster brain means you read faster; a bigger book can hold more pages.

      RAM is used to save data. CPU time is used to process data.
      There is no fixed relationship between CPU and memory usage. A process can occupy all the CPUs of a system while using only a minimal amount of memory, and a process can allocate nearly all the memory available on a system while using minimal CPU time.

      They are not strictly related: some tasks use a lot of just one of those resources, and some use a lot of both.