COMPUTER

Get the latest and most interesting computer tips, tricks, and hidden computer knowledge from my website: http://www.setmech.com/

 

SERVER HARDWARE

By LOVE

Although servers can be built from commodity computer components—particularly for low-load and/or non-critical applications—dedicated, high-load, mission-critical servers use specialized hardware that is optimized for the needs of servers.


A server rack seen from the back

For example, servers may incorporate “industrial-strength” mechanical components such as disk drives and computer fans that provide very high reliability and performance at a correspondingly high price. Aesthetic considerations are ignored, since most servers operate in unattended computer rooms and are only visited for maintenance or repair purposes. Although servers usually require large amounts of disk space, smaller disk drives may still be used in a trade-off of capacity vs. reliability.



CPU speeds are far less critical for many servers than they are for many desktops. Not only are typical server tasks likely to be delayed more by I/O requests than processor requirements, but the lack of any graphical user interface (GUI) in many servers frees up very large amounts of processing power for other tasks, making the overall processor power requirement lower. If a great deal of processing power is required in a server, there is a tendency to add more CPUs rather than increase the speed of a single CPU, again for reasons of reliability and redundancy.

The lack of a GUI in a server (or the rare need to use it) makes it unnecessary to install expensive video adapters. Similarly, elaborate audio interfaces, joystick connections, USB peripherals, and the like are usually unnecessary.

Because servers must operate continuously and reliably, noisy but efficient and trustworthy fans may be used for ventilation instead of inexpensive and quiet fans; and in some cases, centralized air-conditioning may be used to keep servers cool, instead of or in addition to fans. Special uninterruptible power supplies may be used to ensure that the servers continue to run in the event of a power failure.

Typical servers include heavy-duty network connections in order to allow them to handle the large amounts of traffic that they typically receive and generate as they receive and reply to client requests.

The major difference between servers and desktop computers is not in the hardware but in the software. Servers often run operating systems that are designed specifically for use in servers. They also run special applications that are designed specifically to carry out server tasks.

Servers have a unique property: the more powerful and complex the system, the longer it takes for the hardware to turn on and begin loading the operating system. Servers often do extensive pre-boot memory testing and verification, along with starting up remote management services. The hard drive controllers then start up banks of drives in sequence, so as not to overload the power supply with the sudden surge of everything turning on at once, followed by RAID prechecks for correct operation of the redundancy. It is not uncommon for all these pre-boot hardware checks to take several minutes, but then for the machine to run continuously for over a year of uptime.
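To make the sequencing concrete, here is a toy sketch of the staggered spin-up idea; the wattage figures and drive names are invented for illustration, not taken from any real controller.

    # Toy model of staggered drive spin-up: start drives in batches sized
    # so the combined surge stays within the supply's budget.
    # All figures below are illustrative assumptions.
    import time

    SPINUP_WATTS = 25          # assumed surge draw per drive while spinning up
    PSU_BUDGET_WATTS = 100     # assumed power available for spin-up
    drives = [f"sd{c}" for c in "abcdefgh"]   # hypothetical drive bank

    batch_size = PSU_BUDGET_WATTS // SPINUP_WATTS   # 4 drives at a time
    for i in range(0, len(drives), batch_size):
        batch = drives[i:i + batch_size]
        print("spinning up:", ", ".join(batch))
        time.sleep(2)          # wait for the batch to reach speed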

 

Some popular operating systems for servers—such as FreeBSD, Solaris, and Linux—are derived from or similar to the UNIX operating system. UNIX was originally a minicomputer operating system, and as servers gradually replaced traditional minicomputers, UNIX was a logical and efficient choice of operating system for the servers. UNIX-based systems, many of which are free, are the most popular.


Server-oriented operating systems tend to have certain features in common that make them more suitable for the server environment: the absence of a GUI (or an optional GUI); the ability to be reconfigured, in both hardware and software, to at least some extent without stopping the system; advanced backup facilities that permit online backups of critical data at regular and frequent intervals; facilities for moving data between different volumes or devices transparently to the end user; flexible and advanced networking capabilities; features (such as daemons in UNIX or services in Windows) that make unattended execution of programs more reliable; and tight system security, with advanced user, resource, data, and memory protection. Server-oriented operating systems can in many cases interact with hardware sensors to detect conditions such as overheating or processor and disk failure, and either alert an operator, take remedial action, or both, depending on the configuration.
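As a concrete example of the daemons mentioned above, here is a minimal sketch of classic UNIX double-fork daemonization (POSIX-only; the log path is an arbitrary placeholder):

    # Minimal sketch of UNIX daemonization via the classic double fork.
    # POSIX-only; real daemons also redirect stdio and handle signals.
    import os, sys

    def daemonize():
        if os.fork() > 0:
            sys.exit(0)        # parent returns to the shell
        os.setsid()            # new session, detached from the terminal
        if os.fork() > 0:
            sys.exit(0)        # first child exits; grandchild is the daemon
        os.chdir("/")          # don't pin any mounted filesystem
        os.umask(0)

    if __name__ == "__main__":
        daemonize()
        with open("/tmp/daemon-demo.log", "a") as log:   # placeholder path
            log.write(f"daemon running as pid {os.getpid()}\n")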

Because the requirements of servers are, in some cases, almost diametrically opposed to those of desktop computers, it is extremely difficult to design an operating system that handles both environments well; thus, operating systems that are well suited to the desktop may not be ideal for servers, and vice versa. Regardless of OS vendor, system configurations that are ideal for servers may be unsatisfactory for desktop use, and configurations that perform well on the desktop may leave much to be desired on servers. As a result, many operating systems are released in both a server and a desktop version. Nevertheless, the desktop versions of Windows and Mac OS X (also UNIX-based) are used on a minority of servers, as are some proprietary mainframe operating systems, such as z/OS. The dominant operating systems among servers continue to be UNIX versions and clones.

The rise of the microprocessor-based server was facilitated by the development of several versions of Unix to run on the Intel x86 microprocessor architecture. The Microsoft Windows family of operating systems also runs on Intel hardware, and versions beginning with Windows NT have incorporated features making them suitable for use on servers.


Whilst the roles of server and desktop operating systems remain distinct, improvements in hardware performance, hardware reliability, and operating system reliability have blurred the distinction between these two classes of system, which at one point remained largely separate in terms of code base, hardware, and vendors. Today, many desktop and server operating systems share the same code base and differ chiefly in configuration. Furthermore, the rationalisation of many corporate applications towards web-based and middleware platforms has lessened the demand for specialist application servers.

 

The USB stumper


I have only one USB slot on my laptop. I use my laptop to play music and watch television, but mostly to work on my novel. I like being able to move around while I'm working, or take it to another room, or another place entirely.

I only have one printer/scanner/copier. If I want to print something from my laptop, I've got to crawl under my desk, pull out my USB printer cable, plug it into my laptop, and then do everything in reverse when I'm done.

Plus, I only have one USB slot on my laptop! If I wanted to plug in some future prolific jump drive and a PDA, a fight could break out and I could get hurt. And what if I get more than four USB things to plug into my PC? I could have a device riot and my desk would be ripped to shreds.

Fortunately, there are a multitude of solutions. There are USB extenders that lengthen your USB reach. There are hubs that will split one USB port into many. Some of these also sit on your desktop so you won't smack your head on its underside anymore. No more stars and cussing. Unless you want to.

There are cards you (if you are brave) or your computer guy or girl can add into your computer to provide more slots. There are all kinds of cool adapters that convert just about any other weird plug or slide thingy into a USB one. These also work for those laptops that don't have any USB slots. How dare they!

Most of these things include wireless capabilities, too. Soon everything will be wireless, cables will become the new floppy disk, and there will be invisible signals dancing everywhere. Maybe they'll add laser lights and we can have little shows.

I've converted to the USB age. Yeah, it took a little longer than some of you, but I still have Watergate and flowers on my mind. Okay I'm exaggerating, but I know I'm not alone and I hope this makes it easier for others who feel a little overwhelmed by it all.

I still haven't seen that bus full of serial killers either.

Shawn is an experienced freelance editor, proofreader, and writer. An award-winning journalist, columnist, and trumpet player, her knowledge of performance will enhance your copy. Visit Editing, Proofreading and Writing With a Punch! for grammar, punctuation, and spelling snippets, a trade talk blog, and superior services for help with all your writing needs.

For some fun and entertainment, visit Shawniverse for stories, poems, and an unusually entertaining blog.

More information about computers, electronics, and stuff to buy at great prices backed by excellent customer service can be found at sewelldirect.com, where Shawn formerly served as editor.

Article Source: http://EzineArticles.com/?expert=Shawn_Shearer

 

There's a war going on and the latest battle continues to offer the promise of good prices to consumers looking for the best stuff for their computer.


Since the inception of dual-core processors a little while back, the race has been on to see who can build the better chip with this dual-core technology.

During the final days of the single-core battles, there was a stalemate between AMD and Intel. Intel's chips clocked higher but were unable to match the performance that AMD's managed at lower clock speeds.

The longest-standing difference between them has been their suitability for specific tasks. AMD has had the gaming sector in the bag, especially in terms of the value for money of their lower-clocked chips, which could be overclocked to the same speeds as their top models. Intel holds the crown for general performance: when it comes to office-related tasks, Intel processors outperform AMD chips.

As the ability to clock the chips any higher became more and more difficult technically, the next step was to just add another core, theoretically allowing twice as much number crunching in a dual-core processor. This is not exactly how it works however.

Modern operating systems and programs were not designed with multiple-core or dual-core processors in mind; they were designed to make use of one core on one processor. The major expense that went with multiple-processor computers was the circuitry needed to split tasks up amongst the processors and to share cache.

Dual-core processors simply provide two places for tasks to go. Individual tasks are not split up and performed in two different places; instead, whole threads are distributed amongst the cores. This essentially means that each program gets assigned to a core.
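To see this in practice, here is a small Python sketch (the workload is an arbitrary stand-in): the plain loop keeps a single core busy, while the process pool hands whole tasks to separate cores, much as the scheduler hands whole programs to cores.

    # Whole tasks get assigned to cores; no single task is split across them.
    from multiprocessing import Pool

    def crunch(n):
        # Arbitrary CPU-bound stand-in for a real task
        return sum(i * i for i in range(n))

    if __name__ == "__main__":
        jobs = [2_000_000] * 4
        serial = [crunch(n) for n in jobs]      # runs on one core, in sequence
        with Pool(processes=2) as pool:         # two cores, two task queues
            parallel = pool.map(crunch, jobs)   # whole jobs land on each core
        assert serial == parallel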

Because of this there is absolutely no increase in speed for gaming from dual-core processor chips. Only once the games themselves have been programmed to take advantage of dual-core processors will there be a difference. This is due to the intensive nature of games and the number crunching needed for intense graphics, which for now cannot be split over multiple cores in dual-core processors.

So back to the battlefield.

AMD was the first to introduce a dual-core processor solution for the desktop computer. This gave them a slight lead over Intel. Despite this, AMD gave people a bit of a surprise with their new offering.

AMD had always been renowned for giving far more than expected for the price, yet these new dual-core processors were very expensive. Part of what gave AMD a hold in a market previously dominated by Intel was its good pricing, so this shock did not go down well with consumers.

To add insult to injury, Intel's dual-core processor offerings came in at remarkably good value. Both of their initial dual-core processors cost less than AMD's lowest priced model. That's right, AMD's cheapest dual-core processor cost more than Intel's most expensive. This definitely put the ball in Intel's court and was downright disappointing for AMD fans.

AMD did manage to introduce a cheaper model to compete better with the Intel offerings. Despite this, Intel was still the forerunner in this area.
Performance remains a murkier area.

With the relatively new technology involved it is hard to draw a clear conclusion on who is faster. With operating systems only recently oriented towards fully utilizing dual-core processor technology, it is still new territory. Both offer increased performance, but as to who will rule the roost, we'll have to see.

For the time being, it would probably be advisable to just watch. Being a cautious buyer, I prefer to buy into a sure thing. Once things have settled down, prices will balance out and all the related technology will be in place. Then we will be able to form a true opinion on where to put your hard-earned cash.

Get the most honest and useful reviews and Dual-core processor reviews at our Desktop Computer Hardware Reviews site or get practical computer buying tips at our Computer Buying Guide site

Article Source: http://EzineArticles.com/?expert=Peter_Stewart

 

The Celeron and Pentium Processors are two of Intel's best selling CPUs. They are found in a majority of home computer systems. When comparing the two processors it should be first understood that there are different types of Pentium processors - the original Pentium all the way to the Pentium 4 (the latest Pentium processor). The Celeron processors are more or less the same, although you will find them in a wide variety of speeds.

The Intel Celeron processor was always designed to be a low-cost alternative to the Pentium processor line. It is much like a car company that offers cars at various prices, from the luxury sedan to the economy compact. The Celeron is simply a downgraded Pentium that almost anyone can afford (it is essentially the compact). To begin with, Celeron chips have a smaller L2 cache (128 KB, compared with 512 KB in the Pentium 4 Northwood), which translates into slower processing. In fact, current Celerons have a clock speed limit of about 2.0 GHz, whereas the Pentium 4 is capable of speeds in excess of 3.0 GHz. In addition, the Pentium runs at a lower core voltage (1.5 V vs. 1.75 V), which makes it more energy efficient.

In summary, the Pentium 4 is more powerful than the most advanced Celeron processor on the market. However, Intel has planned it to be this way. Many applications will work just fine with a Celeron processor, despite it having a little less power than the Pentium 4. It is a way to save a little cash when buying a new PC, but don't forget the saying "you get what you pay for." Celeron processors are of good Intel quality, but they will never be as good as the Pentium.

This Celeron vs. Pentium review was brought to you by SciNet Science and Technology Search Engine. SciNet is not affiliated with or specifically endorses the Celeron or Pentium processors or the manufacturer, Intel Corp. Please consult the Celeron and Pentium product information and configuration before you purchase either processor. It is also a good idea to seek other up-to-date product reviews and information as necessary.

Bradley James is a senior editor at SciNet.cc, a website containing many helpful consumer electronics review articles. For more information on Celeron and Pentium processor technology, please visit our Celeron vs Pentium webpage.

Article Source: http://EzineArticles.com/?expert=Bradley_James

 

10 Facts you should know about Universal Type Server.




1. Fits any corporate or workgroup environment—regardless of size, workflow or IT support

2. Designed from the ground-up using modern architecture for great stability and speed

3. Leverages SQL-based server and clients with state-of-the-art user interfaces (Cocoa and .NET), outstanding previews and enhanced font handling

4. Supports Mac and Windows environments, offering a great user experience regardless of platform

5. Features web-based administration. Manage your type libraries, users, and backups from anywhere—anytime

6. Includes powerful User Roles for easier administration and more granular control

7. Tracks font licenses and provides reports on usage

8. Provides a seamless transition for both Suitcase Server and Font Reserve Server customers with free migration tools and Active Directory import

9. Meets compatibility requirements: Windows (including Vista), Mac OS X (including Leopard), Adobe CS2/CS3, and QuarkXPress 6.5/7

10. Universal Type Server will be available Spring 2008

 

The primary objective of this manual is to help programmers provide software that is compatible across the family of 64-bit PowerPC™ processors that implement the Vector/SIMD Multimedia Extension technology. This book describes how the vector technology relates to the 64-bit portion of the PowerPC architecture, and it supplements the "PowerPC Microprocessor Family: Programming Environments Manual for 64-bit Microprocessors". Verify in IBM Customer Connect that you have the latest versions of all referenced documents before finalizing any designs. All recommendations given should be considered guidelines intended to help design a functional system; they do not take the place of design-specific results obtained from signal integrity modeling. The considerations and debug recommendations provided in this document and in the referenced documents were developed to help reduce the risk of board design problems.

 

AMD’s triple-core processors have been on the horizon for months now and, after all the speculation and derision, they are finally here. The launch included three Phenom X3 processors: the 8750, 8650, and 8450, all of which will come in at under $200. AMD is, as expected, positioning these processors between their dual-core and quad-core offerings and is targeting cost-conscious consumers, people who will appreciate the performance boost but would rather save a few dollars than go with a quad-core.


The three 65 nm models will arrive at 2.1, 2.3, and 2.4 GHz, priced respectively at $145, $165, and a hefty $195 for the 8750. These models have a TDP of 95 W and 1.5 MB of total L2 cache per processor, as well as 2 MB of shared cache. Also included are HT 3.0, a 1.8 GHz memory controller, and Dual Dynamic Power Management. And because these are 50-series processors, we know they are B3 revision models. They are AM2+ (940-pin) compatible, so consumers won't necessarily need new hardware to run an X3.

With the basic information in front of you, it's not immediately clear whether AMD is fulfilling a need that no one has, offering an interesting new option to consumers, or just making the best of their situation (by releasing "broken" quad-cores as X3s). What we do know, though, is that outside of enthusiast circles there won't be clamoring and complaints about the third core; rather, it'll probably be seen as nothing more (or less) than something between two other options.

AMD is also touting a platform approach: not exactly admitting that it can't compete with Intel on a processor-by-processor basis, but arguing that its entire package is better than the competition's. Specifically, this platform is "Cartwheel", AMD's current take on a mainstream computer with integrated graphics. By using the 780G chipset, AMD could actually produce a better system (dollar-for-dollar) than Intel, so long as you subscribe to their platform approach, something that may actually make sense considering that most sub-$200 processors are found in pre-built computers.

 

Intel is announcing a new wireless technology, called the Rural Connectivity Platform (RCP), that can send a data signal up to sixty miles at speeds up to 6.5 Mbps. The sixty-mile limit is imposed by the curvature of the Earth, not necessarily by any limitation of the wi-fi radios involved. The setup requires two radios, or nodes. The first is positioned on the outskirts of an urban center and possesses a wired connection to the area's network infrastructure. This node then relies upon directional antennae that push the signal up to sixty miles to the receiving node, located in a remote village.


Earlier attempts to make wi-fi technology go farther than a few kilometers met with limited success. The problem lies in the way standard wi-fi radios communicate. The transmitting radio sends its data, then waits a specified period of time for an acknowledgment that the data arrived successfully. When it doesn't get the acknowledgment it requires, it retransmits, and the cycle continues. This effectively consumes the available bandwidth with retransmissions and acknowledgment traffic. Intel's RCP technology redefines how wi-fi radios talk to each other over long distances, defining periods in which each radio is responsible for transmitting its data.
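A back-of-the-envelope calculation shows the problem: at sixty miles, an acknowledgment physically cannot return before a short-range timeout expires. The timeout figure below is an illustrative assumption, not a value quoted from the 802.11 specification.

    # Why stock short-range ACK timing breaks down at sixty miles.
    C = 3.0e8                       # speed of light, m/s (vacuum approximation)
    distance_m = 60 * 1609.344      # sixty miles in metres
    round_trip_s = 2 * distance_m / C

    ack_timeout_s = 50e-6           # assumed short-range ACK timeout (~50 us)

    print(f"round trip:  {round_trip_s * 1e6:.0f} us")   # roughly 640 us
    print(f"ACK timeout: {ack_timeout_s * 1e6:.0f} us")
    # The timeout fires long before the ACK can arrive, so every frame is
    # retransmitted and the retries consume the channel.
    print("ACK arrives in time?", round_trip_s < ack_timeout_s)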

Intel has tested the technology in India, Vietnam, Panama, and South Africa, connecting small remote villages with larger urban centers. The radios require little power, perhaps only five or six watts. This means the technology could be solar powered, an important element in potential implementation in remote areas. Connectivity to the internet with actual usable bandwidth could ignite significant leaps forward in areas such as education, science, and medicine for remote villages in many poorer countries that would otherwise go without internet.

 

Let us discuss the two main quad core computers launched by Intel and AMD. Intel was the first one to launch this computer under the name of Kentsfield, which is also called Core 2 Extreme quad-core QX6700, and it is no less than 80% faster than Conroe or Core 2 Extreme X6800.


The chips of these quad-core computers are based on the latest 'Core' microarchitecture technology, which is not just low-powered but also high-performing. But don't be fooled into thinking that Kentsfield is a power-saving computer by any means.

This computer, with a 2.66 GHz chip and a 1066 MHz front-side bus (FSB), is ideally suited to users who require heavy or highly accurate scientific calculation. It is generally used in fields such as actuarial science, digital content creation, financial applications, and engineering analysis (like CAD). The director of Intel's operations for the digital enterprise group, Mr. Steve Smith, has claimed that this quad-core computer will be 58% faster for creating digital content and for video, digital audio, and photo editing.

To put it simply, Kentsfield and other such computers are not meant for the ordinary person's desk. They are better suited to high-end workstations and desktops. It would not be wise to think that the average customer, who needs a computer for general applications like Word, PowerPoint, games, or the Internet, will go for a complex, complicated, and expensive computer like this; for those purposes, Intel makes many other products. Apart from this, as you must have guessed by now, this high-tech computer needs a lot of power, which makes it practically useless for people on the go. Perhaps it will take some more research and modification before the quad-core computer becomes popular as a laptop.

Intel has also launched the mainstream, commercial version of these computers, known as the Core 2 Quad Q6600. It runs at 2.4 GHz and is currently priced at around $210. This budget quad-core computer is estimated to have a somewhat smaller thermal envelope, at 105 watts compared to the 130 watts of the Kentsfield Core 2 Extreme quad-core QX6700.

AMD's quad-core computer is built on a 65 nm process using a technology called Silicon-on-Insulator. It allows faster transistors with lower power leakage to be used easily, which helps greatly in reducing wasted heat and power. For further energy saving, each core of this computer can run at an entirely different speed, or be turned off totally, with the help of the new 'Enhanced Power Now' feature. This computer is also packed with an enhanced 'Crossbar Switch' that enables users to access different parts of the cores at the same time. Other vital features include the integrated memory controller along with the latest 'Direct Connect Architecture 2.0', which allows much faster 'HyperTransport' speeds.
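To observe independent per-core speeds on a running system, a Linux-specific sketch like the one below reads each core's current frequency from the kernel's cpufreq sysfs interface (the paths assume cpufreq support is present; this is an illustration, not part of AMD's feature set):

    # Linux-only: read each core's current clock via the cpufreq sysfs files.
    import glob

    paths = sorted(glob.glob(
        "/sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_cur_freq"))
    for path in paths:
        core = path.split("/")[5]               # e.g. "cpu0"
        with open(path) as f:
            khz = int(f.read().strip())
        print(f"{core}: {khz / 1000:.0f} MHz")  # cores may differ under load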

Though these features can extract exclamations of wonder and happiness from any geek, the fact remains that for the average PC buyer this is still all Latin and Greek. Many more multi-core computers using this technology are certainly on their way and will revolutionize the computer industry, but it will be some time before people embrace this technology as part and parcel of their lives. Computer makers claim that within a few years the popularity of quad-core computers will triple.
Article Source: http://www.Free-Articles-Zone.com

 

Intel Extended Memory 64 Technology


Mostly compatible with AMD's AMD64 architecture

Introduced Spring 2004, with the Pentium 4F (D0 and later P4 steppings)

 Pentium 4F

Prescott-2M built on 0.09 µm (90 nm) process technology

2.8-3.8 GHz (model numbers 6x0)

Introduced February 20, 2005

Same features as Prescott with the addition of:

2 MB cache

Intel 64-bit (EM64T)

Enhanced Intel SpeedStep Technology (EIST)

Cedar Mill built on 0.065 µm (65 nm) process technology

3.0-3.6 GHz (model numbers 6x1)

Introduced January 16, 2006

die shrink of Prescott-2M

Same features as Prescott-2M

 Pentium D

Main article: List of Intel Pentium D microprocessors

Dual-core microprocessor

No Hyper-Threading

800(4x200) MHz front side bus

Smithfield - 90 nm process technology (2.66–3.2 GHz)

Introduced May 26, 2005

2.66–3.2 GHz (model numbers 805-840)

Number of Transistors 230 million

1 MB x 2 (non-shared, 2 MB total) L2 cache

Cache coherency between cores requires communication over the FSB

Performance increase of 60% over similarly clocked Prescott

2.66 GHz (533 MHz FSB) Pentium D 805 introduced December 2005

Contains 2x Prescott dies in one package

Presler - 65 nm process technology (2.8–3.6 GHz)

Introduced January 16, 2006

2.8–3.6 GHz (model numbers 915-960)

Number of Transistors 376 million

2 MB x 2 (non-shared, 4 MB total) L2 cache

Contains 2x Cedar Mill dies in one package

 

Am2900 series (1975)


Am2901 4-bit-slice ALU (1975)

Am2902 Look-Ahead Carry Generator

Am2903 4-bit-slice ALU, with hardware multiply

Am2904 Status and Shift Control Unit

Am2905 Bus Transceiver

Am2906 Bus Transceiver with Parity

Am2907 Bus Transceiver with Parity

Am2908 Bus Transceiver with Parity

Am2909 4-bit-slice address sequencer

Am2910 12-bit address sequencer

Am2911 4-bit-slice address sequencer

Am2912 Bus Transceiver

Am2913 Priority Interrupt Expander

Am2914 Priority Interrupt Controller

 29000 (29K) (1987–95)

AMD 29000 (aka 29K) (1987)

AMD 29027 FPU

AMD 29030

AMD 29050 with on-chip FPU (1990)

AMD 292xx embedded processor

x86 architecture processors

2nd source (1979–91)

(second-sourced x86 processors produced under contract with Intel)

8086

8088

Am286 (2nd-sourced 80286, so not a proper Amx86 member)

Amx86 series (1991–95)

Am386 (1991)

Am486 (1993)

Am5x86 (a 486-class µP) (1995)


K5 series (1995)

AMD K5 (SSA5/5k86)

 K6 series (1997–2001)

AMD K6 (NX686/Little Foot) (1997)

AMD K6-2 (Chompers/CXT)

AMD K6-2-P (Mobile K6-2)

AMD K6-III (Sharptooth)

AMD K6-III-P

AMD K6-2+

AMD K6-III+



K7 series (1999–2005)

Athlon (Slot A) (Argon,Pluto/Orion,Thunderbird) (1999)

Athlon (Socket A) (Thunderbird) (2000)

Duron (Spitfire,Morgan,Applebred) (2000)

Athlon MP (Palomino,Thoroughbred,Barton,Thorton) (2001)

Mobile Athlon 4 (Corvette/Mobile Palomino) (2001)

Athlon XP (Palomino,Thoroughbred (A/B),Barton,Thorton) (2001)

Mobile Athlon XP (Mobile Palomino) (2002)

Mobile Duron (Camaro/Mobile Morgan) (2002)

Sempron (Thoroughbred,Thorton,Barton) (2004)

Mobile Sempron

 

AMD chipsets

By LOVE

Before the launch of the Athlon 64 processors in 2003, AMD designed chipsets for its own processors spanning the K6 and K7 processor generations, including the AMD-640, AMD-751, and AMD-761. The situation changed in 2003 with the release of the Athlon 64: AMD chose not to design further chipsets for its desktop processors and instead opened the desktop platform to allow other firms to design chipsets. This is the "Open Platform Initiative". The initiative proved to be a success, with many firms such as Nvidia, ATI, VIA, and SiS developing their own chipsets for Athlon 64 processors and, later, the Athlon 64 X2 and Athlon 64 FX, including the Quad FX platform chipset from Nvidia.


The initiative went further with the release of the Opteron server processors: AMD stopped designing server chipsets in 2004 after releasing the AMD-8111 chipset and again opened the server platform for firms to develop chipsets for Opteron processors. As of today, Nvidia and Broadcom are the sole designers of server chipsets for Opteron processors.
When AMD completed the acquisition of ATI Technologies in 2006, it gained ATI's chipset design team, which had previously designed the Radeon Xpress 200 and Radeon Xpress 3200 chipsets. AMD then renamed the chipsets for AMD processors under AMD branding (for instance, the CrossFire Xpress 3200 chipset was renamed the AMD 580X CrossFire chipset). In February 2007, AMD announced its first AMD-branded chipset since 2004 with the release of the AMD 690G chipset (previously under the development codename RS690), targeted at mainstream IGP computing; it was the industry's first to implement an HDMI 1.2 port on motherboards, and it shipped more than a million units. While ATI had aimed at releasing an Intel IGP chipset, the plan was scrapped, and the inventories of the Radeon Xpress 1250 (codenamed RS600, sold under the ATI brand) were sold to two OEMs, Abit and ASRock. Although AMD states that it will still produce Intel chipsets, Intel has not granted ATI a license for the 1333 MHz FSB. Considering the rivalry between AMD and Intel, AMD is unlikely to release more Intel chipset designs in the foreseeable future.

On November 15, 2007, AMD announced a new chipset series portfolio, the AMD 7-Series chipsets, covering everything from the enthusiast multi-graphics segment to the value IGP segment and replacing the AMD 480/570/580 chipsets and the AMD 690 series; the series marks AMD's first enthusiast multi-graphics chipset. The discrete graphics chipsets were launched on November 15, 2007 as part of the desktop platform codenamed Spider, and the IGP chipsets were launched later, in Spring 2008, as part of the platform codenamed Cartwheel.

AMD will also return to the server chipset market with the next-generation AMD 800S series server chipsets, scheduled for release in the 2009 timeframe.

 

AMD's first completely in-house x86 processor was the K5, launched in 1996. The "K" was a reference to "Kryptonite", which, in comic book lore, was the only substance that could harm Superman, a clear reference to Intel, which dominated the market at the time, as "Superman".


In 1996, AMD purchased NexGen specifically for the rights to their Nx series of x86-compatible processors. AMD gave the NexGen design team their own building, left them alone, and gave them time and money to rework the Nx686. The result was the K6 processor, introduced in 1997.

The K7 was AMD's seventh generation x86 processor, making its debut on June 23, 1999, under the brand name Athlon. On October 9, 2001 the Athlon XP was released, followed by the Athlon XP with 512KB L2 Cache on February 10, 2003.

Athlon 64, Opteron and Phenom

Quad-core "Barcelona" die-shot.Main articles: Athlon 64, Opteron, and Phenom (processor)

The K8 was a major revision of the K7 architecture, with the most notable features being the addition of a 64-bit extension to the x86 instruction set (officially called AMD64), the incorporation of an on-chip memory controller, and the implementation of an extremely high performance point-to-point interconnect called HyperTransport, as part of the Direct Connect Architecture. The technology was initially launched as the Opteron server-oriented processor. Shortly thereafter it was incorporated into a product for desktop PCs, branded Athlon 64.

AMD released the first dual-core Opteron, an x86-based server CPU, on April 21, 2005. The first desktop dual-core processor family, the Athlon 64 X2, came a month later.[10] In early May 2007, AMD abandoned the string "64" in its dual-core desktop product branding, which became Athlon X2, downplaying the significance of 64-bit computing in its processors; subsequent updates brought improvements to the microarchitecture and a shift of target market from mainstream desktop systems to value dual-core desktop systems. In early 2008, AMD also started to release dual-core Sempron processors, exclusively in China, branded as the Sempron 2000 series, with a lower HyperTransport speed and a smaller L2 cache; with these, the firm completed its dual-core product portfolio for each market segment.

The latest AMD microprocessor architecture, known as K10, is the successor to the K8 microarchitecture. The first processors released on this architecture were introduced on September 10, 2007, consisting of nine quad-core Third Generation Opteron processors. This was followed by the Phenom processor for the desktop. K10 processors come in dual-core, triple-core, and quad-core versions, with all cores on a single die.

 

Advanced Micro Devices, Inc. (AMD) (NYSE: AMD) is an American multinational semiconductor company based in Sunnyvale, California, that develops computer processors and related technologies for commercial and consumer markets. Its main products include microprocessors, motherboard chipsets, embedded processors and graphics processors for servers, workstations and personal computers, and processor technologies for handheld devices, digital television, and game consoles.


AMD is the second-largest global supplier of microprocessors based on the x86 architecture, after Intel Corporation, and the third-largest supplier of graphics processing units. It also owns 21 percent of Spansion, a supplier of non-volatile flash memory. In 2007, AMD ranked eleventh among semiconductor manufacturers.

 

Despite the ultimate importance of the microprocessor, the 4004 and its successors, the 8008 and the 8080, were never major revenue contributors at Intel. When the next processor, the 8086 (and its variant, the 8088), was completed in 1978, Intel embarked on a major marketing and sales campaign for that chip, nicknamed "Operation Crush", intended to win as many customers for the processor as possible. One design win was the newly created IBM PC division, though the importance of this was not fully realized at the time.




IBM introduced its personal computer in 1981, and it was rapidly successful. In 1982, Intel created the 80286 microprocessor, which, two years later, was used in the IBM PC/AT. Compaq, the first IBM PC "clone" manufacturer, produced a desktop system based on the faster 80286 processor in 1985 and in 1986 quickly followed with the first 80386-based system, beating IBM, establishing a competitive market for PC-compatible systems, and setting up Intel as a key component supplier.



In 1975, the company had started a project to develop a highly advanced 32-bit microprocessor, finally released in 1981 as the Intel iAPX 432. The project was too ambitious: the processor was never able to meet its performance objectives, and it failed in the marketplace. Intel extended the x86 architecture to 32 bits instead.

 

The computer hardware that contains the host controller and the root hub has an interface geared toward the programmer, called the Host Controller Device (HCD), which is defined by the hardware implementer.


In the version 1.x era, there were two competing HCD implementations, the Open Host Controller Interface (OHCI) and the Universal Host Controller Interface (UHCI). OHCI was developed by Compaq, Microsoft, and National Semiconductor; UHCI by Intel.

VIA Technologies licensed the UHCI standard from Intel; all other chipset implementers use OHCI. UHCI is more software-driven, making it slightly more processor-intensive than OHCI but cheaper to implement. The dueling implementations forced operating system vendors and hardware vendors to develop and test against both, which increased cost.

HCD standards are out of the USB specification's scope, and the USB specification does not specify any HCD interfaces. In other words, USB defines the format of data transfer through the port, but not the system by which the USB hardware communicates with the computer it sits in.

During the design phase of USB 2.0, the USB-IF insisted on a single implementation. The USB 2.0 HCD implementation is called the Enhanced Host Controller Interface (EHCI). Only EHCI can support hi-speed (480 Mbit/s) transfers. Most PCI-based EHCI controllers contain additional HCD implementations, called 'companion host controllers', to support Full Speed (12 Mbit/s) and Low Speed (1.5 Mbit/s) devices. The virtual HCD on Intel and VIA EHCI controllers is UHCI; all other vendors use virtual OHCI controllers.

 

Universal Serial Bus (USB) is an input/output port standard for computers and digital equipment that allows easy transfer of data via a direct connection or cable. The original USB standard, version 1.1, was superseded by USB 2.0, also known as hi-speed USB. A hi-speed USB host controller is the hardware inside the computer that provides hi-speed USB functionality to the ports.



USB first hit the market in November 1995, but the new standard had compatibility problems. These bugs were addressed, and the subsequent USB version is now referred to as "original" USB 1.1. The data transfer rate of USB 1.1 was an impressive 12 megabits per second (Mbps), intended to replace slower parallel and serial ports for peripheral devices. The first devices to be widely adopted for USB ports were keyboards and mice; printers, scanners, external tape drives, and other devices followed.

As demands for faster data transfer increased, a newer version of USB answered the call. USB 2.0 boasts maximum data rates of 480 Mbps, 40x faster than original USB. Computers that supported the old standard required a new hi-speed USB host controller to take advantage of the faster speeds. Devices made for the new 2.0 standard, such as memory sticks and digital cameras, default to the old, slower transfer speeds if plugged into a computer with a USB 1.1 controller installed.

A hi-speed USB host controller is built into modern computers, while older computers can be updated with the hardware. A controller is inexpensive and can be purchased anywhere computers are sold. The hi-speed USB host controller is a card that easily installs into any available slot in the motherboard. The back-facing plate of the controller provides two or more hi-speed USB ports.

To take advantage of USB 2.0 speeds, both the computer and the device plugged into the USB port must support the 2.0 standard. A hi-speed USB host controller cannot make a USB 1.1 device operate at 2.0 speeds. Hi-speed controllers are backwards compatible, however, falling back to the slower 1.1 standard for devices that require it.
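For a feel of the difference, this quick sketch compares raw transfer times for a 700 MB file at the two signalling rates quoted above; real-world throughput is lower because of protocol overhead.

    # Raw bus-rate comparison; actual throughput is lower in practice.
    FILE_MB = 700
    file_bits = FILE_MB * 8 * 1024 * 1024

    for name, rate_bps in [("USB 1.1", 12e6), ("USB 2.0", 480e6)]:
        seconds = file_bits / rate_bps
        print(f"{name}: {seconds:,.0f} s")
    # USB 1.1: ~489 s versus USB 2.0: ~12 s -- the 40x factor cited above.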

 

USB Ports

By LOVE

Just about any computer that you buy today comes with one or more Universal Serial Bus connectors on the back. These USB connectors let you attach everything from mice to printers to your computer quickly and easily. The operating system supports USB as well, so the installation of the device drivers is quick and easy, too. Compared to other ways of connecting devices to your computer (including parallel ports, serial ports and special cards that you install inside the computer's case), USB devices are incredibly simple!


In this article, we will look at USB ports from both a user and a technical standpoint. You will learn why the USB system is so flexible and how it is able to support so many devices so easily -- it's truly an amazing system!

Anyone who has been around computers for more than two or three years knows the problem that the Universal Serial Bus is trying to solve -- in the past, connecting devices to computers has been a real headache!

Printers connected to parallel printer ports, and most computers only came with one. Things like Zip drives, which need a high-speed connection into the computer, would use the parallel port as well, often with limited success and not much speed.

Modems used the serial port, but so did some printers and a variety of odd things like Palm Pilots and digital cameras. Most computers have at most two serial ports, and they are very slow in most cases.

Devices that needed faster connections came with their own cards, which had to fit in a card slot inside the computer's case. Unfortunately, the number of card slots is limited and you needed a Ph.D. to install the software for some of the cards.

The goal of USB is to end all of these headaches. The Universal Serial Bus gives you a single, standardized, easy-to-use way to connect up to 127 devices to a computer.
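The 127-device ceiling comes from USB's 7-bit device address, with address 0 reserved for enumeration. To see what is actually on the bus, here is a minimal sketch using the third-party PyUSB library (pip install pyusb); it assumes libusb is available on the host, and device access may need elevated privileges.

    # Enumerate attached USB devices with PyUSB (third-party; needs libusb).
    import usb.core

    for dev in usb.core.find(find_all=True):
        print(f"bus {dev.bus} addr {dev.address}: "
              f"vendor=0x{dev.idVendor:04x} product=0x{dev.idProduct:04x}")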

 

Desktop computers, a type of microcomputer, fit on a desktop and are widely used in offices and homes. Personal computers (or home computers), workstations, Internet servers, and special communications computers are the four types of desktop computers in use. Desktop computers are widely used in households, schools, and businesses because they are very cheap.
Desktop computers are normally modular, and their components can be easily upgraded or replaced. They are available in elegant case styles. They are used for various tasks such as organizing digital photos, office work, video editing, and accessing the Internet.

Micro Instrumentation and Telemetry Systems (MITS) offered the first desktop-type system, the Altair 8800, in 1975. The launch of this variety of computer encouraged scores of other companies to produce personal computers. In 1977, Tandy Corporation (Radio Shack) launched its model of personal computer with a keyboard and CRT. In the same year, the Commodore PET and Apple II were also released to market; these are the forerunners of today's desktop computers.

The introduction of the IBM PC by IBM in 1981 was a milestone in the field of the personal computer. Based on Intel's 8088 microprocessor, it became a success overnight. The introduction of a 16-bit microprocessor paved the way for more powerful and faster micros, and standardization in the computer industry became possible because the IBM PC used an operating system that was available to all computer manufacturers. The Apple Mac, using the Motorola 68000, is another series of popular 32-bit personal computers, launched by Apple in 1984.

A modern desktop computer consists of a display, motherboard, CPU, primary storage (RAM), expansion cards, power supply, optical disc drive, secondary storage (HDD), keyboard, and mouse.

All desktop computers come with ports that allow different external devices to be plugged into the computer, viz. keyboards, monitors, scanners, and printers. The different types of ports are Universal Serial Bus, Ethernet, modem, headphone, serial, parallel, PS/2, VGA, power connection, FireWire, and card reader.

You should surf the Internet and check the detailed guides available before purchasing a desktop computer. A few important points are listed here which may help you make an informed purchasing decision:

Processors (CPUs): Choosing between an Intel processor and an AMD one is very difficult. The main differences come down to relative speed and the number of cores in the processor. You should refer to Internet sites for detailed information on this.

Memory (RAM): It is best to have at least a 1 GB memory system, and the older DDR memory standard should be avoided. For better performance, faster memory is a must; also ensure that future memory upgrades are possible.

Hard Drives: 250 GB or more of storage space is best to have these days. The Serial ATA interface is used in most drives now for ease of installation.

Optical Drives (CD/DVD): A multiformat DVD burner that supports both +R/RW and -R/RW at 16x recordable speed is best to have in a desktop computer.

Video Cards: Integrated graphics is sufficient if you are not doing 3D graphics. The important things to consider are the memory capacity of the card, the version of DirectX supported, the output connectors, and the performance. For games, a DirectX 10 card with 256 MB of memory should be considered.

External Connectors: Instead of internal cards, external interfaces are now preferred for various upgrades and peripherals. Both IEEE 1394 (FireWire) and USB 2.0 ports should be present in a desktop computer.

Monitors: These days LCDs are more popular than CRTs because of their lower power consumption and smaller size. The traditional 4:3 aspect ratio is also being replaced by wider display screens, and prices of 20-22 inch models are decreasing.




Article Source: http://www.articlesbase.com/hardware-articles/desktop-computer-know-it-better-346034.html

 




Networking is an interconnected collection of autonomous computers that allows all users controlled access in a cost-effective manner. The main application of a good network is sharing. The sharing can be of any hardware, software, or peripheral devices that are very costly and/or available only in limited numbers.


With networking, you can access remote databases and communication facilities. For smooth functioning, a business needs suitable networking support. There are various types of network infrastructure services, such as LAN, MAN, and WAN. All small business network solutions have a LAN as their base. A LAN, or local area network, covers only a small area, which may be an office or a building. The main purpose of business networking solutions is to serve their users in resource sharing.

A Metropolitan Area Network (MAN) is spread across an entire city; a cable TV network is one example. The purpose of this kind of business networking solution is also to facilitate the sharing of hardware and resources among its users. Another network infrastructure design available is the WAN, or Wide Area Network, a group of computers separated by large distances and tied together. The basic network security and support services, keeping LAN, MAN, and WAN in view, are modern and robust.


Network installation services, in general, use both branded and non-branded equipment, including products from some of the best companies, such as IBM, Compaq, Apple, and Toshiba. However, if you are on a low budget, you can also choose from used and refurbished products.

There are also various network-monitoring services offered by network providers, wherein a remote network monitoring service keeps an eye on your network and ensures its smooth working. Various specific network protocols are designed to implement better safety within a client-server business network, to attain better network security and privacy. Qualified network professionals monitor for any malicious activity under the purview of these business networking solutions.

Effective network installation services help in setting up networks quickly and effectively, including in cases where existing networks are relocated.

There are companies that provide comprehensive network security and support services, ensuring proper functioning of the network through their prompt troubleshooting services.



While companies that can afford it keep their own qualified networking staff for round-the-clock maintenance and monitoring of the networking infrastructure, small businesses with limited budgets prefer to outsource these services on an AMC contract or similar.

About Author

Smit Mathur is an expert at writing articles and is currently working for Swift Computers. For more information on networking solutions, network installation services, PC support, and computer support, please visit http://www.swiftcomputers.com.au/

 

Anonymous Proxy

By LOVE



An anonymous proxy, also referred to as an anonymous proxy server, allows a client to access a file, web page, or some other resource through a server that services the client's requests via another remote server. For example, when a client accesses a web page through an anonymous proxy, the client talks to the proxy and the proxy talks to the web page, maintaining the privacy of the client, such as his/her IP address. The purpose of such a proxy is to protect the client's privacy from the service and from other parties who may be logging and inspecting the client's connection.
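As an illustration of that client-to-proxy-to-server flow, here is a hedged sketch using Python's third-party requests library; the proxy address is a documentation placeholder, not a real service.

    # Route an HTTP request through a proxy so the target site sees the
    # proxy's IP rather than the client's. Third-party: pip install requests.
    import requests

    proxies = {
        "http": "http://203.0.113.10:8080",    # hypothetical proxy address
        "https": "http://203.0.113.10:8080",
    }
    resp = requests.get("http://example.com", proxies=proxies, timeout=10)
    print(resp.status_code)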


Such proxies are commonly used in schools and workplaces to bypass firewalls and monitoring services that are in place. Students will often use anonymous proxies to access social networking websites that the school has blocked as harmful to student productivity. Employees of a company may try to circumvent forms of monitoring that track or control which websites they visit.


Anonymous proxies serve as a wall between the client and the service being accessed. These servers can be used to bypass restrictions and access services that may be blocked by the country or other organization providing the Internet connection, while other users rely on them solely for the privacy they may afford.



Risks Behind Anonymous Proxies

Because of how anonymous proxies, especially those running on web pages, are designed, all data sent to the proxy server is unencrypted. It is therefore possible for confidential information such as logins and passwords to be recorded by a malicious proxy server. Also, through proxy chaining, some clients could fall victim to a web page displaying false security measures, allowing all proxies within the chain to trace the client's activities. Thus, for security's sake, only trusted anonymous proxies with a clear privacy policy should be used.

About Author

Martha Thames writes on topics such as Anonymous Proxy, Anonymous Proxy List, and Proxy Browser for The Tech FAQ.



Article Source: http://www.1888articles.com/author-robert-thomson-5539.html

 
