Saturday, December 26, 2009

Stolen Berkeley Heights computer logs into Facebook, leads to arrest

BERKELEY HEIGHTS -- If you’re going to keep a stolen computer, don’t use it to update your Facebook status.

Police in Berkeley Heights used LoJack and Facebook to track a computer stolen in a burglary there to Passaic. In March, a Dell laptop computer valued at $1,272 was stolen from a home on Ridge Drive in Berkeley Heights, according to Sgt. Frank Fortunato, the department’s spokesman.

Police notified employees at LoJack, which makes radio-transmitting recovery devices for vehicles and electronics, and asked them to activate a tracking device installed in the computer.

The computer stayed dormant until late September, when police began receiving signals that LoJack had identified a location, the residence of Dayron Johnson, 35, in Passaic.

“We were able to see that he was logging onto his Facebook page and AOL screen name,” said Detective Sgt. Michael Mathis, of the Berkeley Heights Police Department.

Mathis said there was nothing incriminating written on Johnson’s Facebook page, but investigators took the information to the Union County Prosecutor and obtained a warrant.

Several burglaries in Mountainside resembled the Berkeley Heights break-in in which the computer was taken - entry through the front door, with small, expensive goods carried out in a pillowcase - so police from the two communities executed the search together in case other evidence turned up.

“We walked in and saw the computer sitting on a small desk in the dining room,” Mathis said. “It was turned on and his Facebook page was on the screen.”

Mathis said there was not enough evidence to charge Johnson with the burglary, but police charged him with possession of stolen property. Johnson is currently in Union County Jail, being held on $75,000 bail.

“In this case, Facebook helped with identifying and painting a picture of our suspect,” Mathis said. “But, it was LoJack that helped."

Thursday, December 24, 2009

DHB Computer Hit with Malicious Virus

While being upgraded on Thursday, the computer system of the Waikato District Health Board was attacked by a malicious virus, forcing a shutdown of the entire system.

Hospital staff across the Waikato DHB area reportedly shut down about 3,000 computers immediately after they were attacked by the Conficker virus, and parts of the system were brought back into use last night.

Calling the attack "random" and part of "the ongoing appearance of malware," Alan Hesketh, the Health Ministry's Deputy Director General of the Health Information Directorate, said the board had previously fended off such attacks on numerous occasions.

"When they were going through this upgrade, clearly some vulnerability was exposed, and taken advantage of. The particular worm they've got is out there all the time. We see attempts to get into our (IT) environment at least five or six times a week. We're concerned about it, and that's why all the DHBs have controls in place", Mr. Hesketh.

Although all the board's systems were connected, there have been no reports of the virus reaching any other IT system, much to the organization's relief.

Sunday, December 13, 2009

Microsoft Dismisses BitLocker Threat

Microsoft claims recent Internet reports about vulnerabilities in its BitLocker security technology are exaggerated.

"Success comes at a price," wrote Microsoft senior director Paul Cooke, in a blog post Monday. That price, Cooke wrote, includes "greater scrutiny and misinterpretation of some of the technologies. One of those technologies is BitLocker," he said.

BitLocker is a drive encryption system that Microsoft introduced with Windows Vista in 2007. It's also included in some versions of the new Windows 7 operating system, which debuted in October.

Security bloggers, including researchers at Germany's Fraunhofer Institute for Secure Information Technology, in recent days have published reports that PCs and laptops protected with BitLocker could be compromised in certain circumstances.

But Cooke said those circumstances covered scenarios that were highly unlikely to occur in real life.

"This research is similar to other published attacks where the computer owner leaves a computer unattended in a hotel room and anyone with access to the room could tamper with the computer," wrote Cooke.

"This sort of attack poses a relatively low risk to folks who use BitLocker in the real world," he said.

Still, Cooke reminded Windows users that BitLocker is only one element of Microsoft's multi-tiered approach to security.

"Even with the great enhancements made in Windows 7 such as BitLocker To Go, it still remains that BitLocker alone is not a complete security solution," said Cooke.

"IT professionals as well as users must be diligent when protecting IT resources and the best protection against these sorts of targeted attacks requires more than just technology. It requires end user education and physical security also play important roles," Cooke wrote.

Tuesday, December 1, 2009

Making the first computer virus

If you've ever had to spend a lot of money on antivirus software, you'd be forgiven for wanting to take Dr Fred Cohen aside for, to put it politely, a few choice words.

But although Dr Cohen is responsible for creating the first ever computer virus some 26 years ago, his pioneering research has in fact led the way in protecting computers from the threats that surfaced in the years to come.

He told BBC World Service's Witness programme about the day he made the discovery while studying at the University of Southern California.

After a neighbouring university created a Trojan horse - which allowed hackers to gain access to a machine - Dr Cohen realised that the Trojan could be programmed to duplicate itself.

It was the proverbial lightbulb moment.

"I was sitting there in the class and all of a sudden it dawned on me that if that Trojan horse copied itself into other programs, then all those programs would be infected, and then everybody that ran any of those programs would get infected and so forth.

"It was at that point immediately obvious that it was game over."

He discussed the idea with Dr Len Adleman, another computer security expert at the university.

"Fred approached me and said he had this new type of computer security threat, and he began to describe what we now call viruses," recalled Dr Adleman.

"He wanted to run some actual experiments, in particular on the computer that I used.

"There was no point in running an experiment, since it was so obvious that it was going to work."

However, Dr Cohen insisted they make sure - and the first computer virus was born.

"In that moment, I pretty much understood the bad news.

"I spent the next five or six years of my life trying to find ways to protect against it and understanding the limits of what could ever be done."

Ethical dilemma

Armed with their new discovery, the pair faced a problem.

It had the potential to have a massive negative impact on the computing world. As academics, did they have an obligation to share their findings or should the vulnerability be kept secret?

They decided to publish the paper.

"If we told people about computer viruses, they could potentially protect themselves," said Dr Adleman.

"It was also at least my impression that computer viruses were inevitable, and were going to arrive whether Fred published or not.

"In the end we decided to publish, but to not make the code that Fred put in his paper so explicit that an amateur could take it and produce computer viruses."

Dr Cohen agreed.

"This was going to happen one way or another. The real question was is it going to happen after somebody's done the research, and figured out what to do about it, or is it going to happen before the research is done - and then we're really in trouble."

Dr Cohen believes that genuine research into possible threats has not happened for quite some time.

"As far as I can tell, somewhere around the late eighties or early nineties was the end of the real research related to computer viruses.

"There are businesses that want to make sure they keep making money by having cures that fix the last one, but not the next one."

Tuesday, November 24, 2009

Open source comes to Army Go Mobile program

The Army’s Maj. Keith Parker visited the GCN Lab this week to show how the Go Mobile program interfaces with the Army Knowledge Online (AKO) system.

It was an interesting meeting, with Parker explaining how the Army is using cell phones that soldiers already carry to create a secure network built around the Redfly companion - basically a dumb terminal that relies on the cell phone for its brains - along with other gear such as printers, projectors, goggles and even solar-powered charging kits.

As a technology guy, I was impressed with the types of devices that were shown, especially the tiny projector. But what impressed me more than anything was that every single application running in the Go Mobile Program was open source. Here we have the Army relying on open-source programs for a major network. Back when I was starting out at GCN in 1996 as a reporter, and later on in the lab around 1998, everything that I covered was pretty much custom-built for the military. I once attended a military simulation conference and they were showing huge computers that were built just for the military alongside some commercial off the shelf (COTS) equipment that was just coming into style.

The COTS equipment saved the military money because they were using commercial software to create military applications. But using open-source software is even better, because it’s free, assuming you can get some good applications built.

Parker said the troops themselves initiate many of the applications. Recruits entering the Army today are technically savvy, and many can even write advanced code, he said. With the military asking soldiers to contribute ideas for the AKO program, having them both suggest applications and write them plays directly into that plan.

Moving from million-dollar customized applications to COTS was a big step. But as much ink as GCN has devoted to COTS over the years, this could be an even bigger move. When rank-and-file soldiers are able to write the very applications they use, it gives new meaning to the Army of One. And it makes you respect our soldiers even more.

Sunday, November 8, 2009

Google co-founder Sergey Brin wants more computers in schools

High school dropout Sergey Brin has a few ideas on how the educational system should be improved. Not surprisingly from a guy who co-founded Google, where he still serves as president of technology and one of the company's three key decision-makers, a lot of those ideas center on computers.

"It's important for students to be put in touch with real-world problems," Brin said. "The curriculum should include computer science. Mathematics should include statistics. The curriculums should really adjust."

He advocated putting all textbooks on computers, to make for easier access, and for putting high school students to work -- writing Wikipedia articles, and teaching technology to senior citizens and middle school students. In teaching, they will learn.

Brin spoke today at a conference on Google's campus, Breakthrough Learning in the Digital Age, which the tech company is co-hosting with Common Sense Media and the Joan Ganz Cooney Center at Sesame Workshop. By and large, speakers passionately spoke of the advantages of equipping schools with the latest in digital technology, and of engaging students on their home turf -- computers.

Google has been relatively quiet in the field of education, but the company is starting to make a splash. For the last three years, it has given schools the premium version of its Google Apps, enabling schools to run their business and provide teachers with e-mail and other tools that it typically charges corporations for. In part, the giveaway helps advance Google's plan of providing universal access to all the world's information; in part it helps prepare the workforce of tomorrow; and it also indoctrinates that workforce with the Google brand.

"The kids who are in school are our future business leaders," said Cristin Frodella, product marketing manager for the Google Apps, Education Edition. "If they like Google Apps now, they'll ask for it by name. There is a value there."

The presence of Brin at the conference, as well as Google Chief Executive Eric Schmidt and company vice president Marissa Mayer, speaks volumes to the company's commitment to education, said Jim Steyer, CEO of Common Sense Media, an advocacy group. "It's a very positive symbolic role," Steyer said. "Google is serious about helping kids, particularly disadvantaged kids."

Brin, wearing some funky new Vibram FiveFingers shoes that fit the feet like a glove, told how his family enrolled him in a Montessori school from age 6 to 11, where he was able to explore his own interests in learning. "The school had an Apple II," said Brin, now 36. "When I was 9, my parents gave me a Commodore 64, which was fun. At the time, the opportunity to program your own computer was easier than it is today. Today there are significantly larger barriers because of the complexity built into computing."

After he left the Montessori school, Brin felt he was stuck in a 19th-century curriculum, and he ultimately quit high school after his junior year. He remains on leave from Stanford, where he was working on his doctorate when he and Larry Page hit upon the algorithm that led to Google, and turned them both into billionaires.

Brin had some other ideas for improving schools, most notably treating teachers better. His mother-in-law, Esther Wojcicki, who spoke at the conference, teaches English and journalism at Palo Alto High School, and many of his friends are teachers. "It's really a miserable job," he said. "They're not really paid a living wage."

Brin foresees computers getting cheaper and cheaper, and broadband access becoming more ubiquitous, which will make computers more a part of education than ever. A relatively new parent, Brin was asked by moderator James Bennet, editor of Atlantic magazine, what kind of technological world he envisions 15 or 20 years from now.

Brin said he hoped that the increasingly powerful access to information would free people up to become more capable individuals. But he did see a downside.

"When I was growing up, I always knew I'd be in the top of my class in math, and that gave me a lot of self-confidence," he said. But now that studens can see beyond their own school or hometown, they see that "there are always going to be a million people better than you at times, or someone will always be far better than you. I feel there's an existential angst among young people. I didn't have that. They see enormous mountains, where I only saw one little hill to climb."

Wednesday, November 4, 2009

The computer engineer who thinks we're doomed

It was a fullish moon when I picked up a new book called "The Lights in the Tunnel," thinking that the title was sure to lift my spirits on All Souls Day.

Perhaps I should have picked me up some Dostoyevsky.

It's not that "The Lights in the Tunnel" isn't thoughtful or interesting. The author, Martin Ford, is a computer engineer who has clearly spent many hours considering the true effects of technology on society.

It's just that a rough summation of those effects might be described as "really bloody terrible."

Essentially, he believes that technology is the direct cause of job losses that will never return. In fact, his fear is that even in those industries that are currently still labor intensive, job losses are inevitable. Which just might mean that there will be vast numbers of people all over the world who will have no money to spend at Zara. Not even at Old Navy.

Naturally, Ford has found himself in a spirited debate with economists who seem to think his arguments border on loonism.

A chap named Robin Hanson seems rather hurt that Ford isn't in the thrall of economists' thinking--you know, the optimistic stuff about how technology will always produce more jobs and more wealth because we humans are, well, so clever.

Perhaps I paraphrase a touch, but economists such as Hanson tend to believe that economic inequality might be a politically difficult thing, but it doesn't portend economic disaster: because, as Hanson says, "producers can focus on giving the rich what they want, and innovation and growth is just as feasible for elite products as for mass products."

Now of course, I'm not going to argue with economists about human behavior because it's generally akin to arguing with a hockey color commentator about creme caramel.

However, Ford, the techie whom economists dismiss, has a very interesting solution to his rather bleak human scenario. He seems rather keen on a consumption tax, or a direct tax on business that would attempt to capture the income that people would have earned if they had had a job. Then he would incentivize the unemployed to contribute to society according to their own talents and society's needs.

You need a strong heart and stomach to read Ford's book, but some small part of me cannot help but wonder whether his rather miserable prognostication might have some truth to it.

"Glenn Beck would scream," Ford told me in an e-mail. Which made me immediately wonder why his publishers hadn't put that quote on the book cover.

Strangely, Ford isn't some sandal-wearing socialist wagging his finger at the money lenders.

"Capitalism has worked out fairly well for me, and I'd like to keep it around. If the ideas in the book are correct, then I really wonder if the system will be sustainable without some type of intervention," he told me.

Here is a computer engineer who's genuinely worried about, well, human beings.

"If that underclass increases relentlessly over time, and if you start seeing more educated people getting dragged into it, then we are going to have a huge problem. I think that may happen as machines and computers keep getting better until eventually they can do the jobs of even people with lots of education and training. At that point I think you have to do something," he added.

Unfortunately, the history of the world doesn't necessarily offer too much hope for the implementation of the kind of intervention that Ford is suggesting.

So one day, you, me, Ben Affleck, Bruce Willis, Billy Bob Thornton, and Liv Tyler might be seated in a devastated landscape muttering: "How were we to know we were supposed to listen to bloody Martin Ford? He was just some computer engineer."

Thursday, October 29, 2009

BlackBag Technologies Announces SoftBlock Write-Blocking Solution for Mac

BlackBag Technologies, Inc. today announced the release and general availability of SoftBlock, the Company's much anticipated write-blocking software application that Mac forensic examiners can easily use to mount and manage devices containing suspect data in a forensically sound manner...

With a uniquely light footprint, the kernel-based utility allows for constant use on shared or single-user systems. SoftBlock identifies devices upon connection and mounts them, based on user selection, as read-only or read-write. BlackBag's new application offers unmatched features, including:

-- Mobility: Once installed on a computer, SoftBlock is always readily available for use, which means one less piece of equipment to manage while on the go.
-- Speed: SoftBlock enables device mounting and management directly from the user's computer. This allows a quick preview before further analysis, expediting investigation time.
-- Affordability: With no capacity constraints and seamless integration into the examiner's workflow, SoftBlock costs significantly less than the multiple hardware blockers required to manage the same load.
-- Security: Safeguards such as "Always On" protection and double confirmation in mounting new devices read-write ensure evidence is preserved intact.


"BlackBag's release of SoftBlock stands as yet another example of our commitment to innovation within the Mac forensic field," said Derrick Donnelly, chief technology officer at BlackBag Technologies. "Driven directly by the needs of today's Mac examiners, SoftBlock focuses on providing consistent protection for mounted devices, as well as unparalleled convenience and simplicity for the examiner. BlackBag is pleased to unveil SoftBlock as an affordable, easy tool for use by both corporate and law enforcement Mac forensic examiners."

SoftBlock is available for $239. For more information on BlackBag's SoftBlock write-blocking software application, visit www.blackbagtech.com.

About BlackBag Technologies

BlackBag Technologies, Inc. provides Mac-based data forensic and eDiscovery solutions to law enforcement and private sector clients. Based in Silicon Valley, BlackBag offers clients a comprehensive and secure suite of services, software and training solutions. The Company acknowledges the growing challenges faced by forensic examiners and legal professionals in the field and is dedicated to creating flexible, open environment solutions. BlackBag serves a wide range of clients including federal, state and local law enforcement agencies, as well as leading private sector security, legal and human resource professionals. Visit www.blackbagtech.com.

Tuesday, October 27, 2009

Acer Aspire M5800-U5802A and eMachines EZ1601-01 Nettop

This is the second part of a two-part series discussing three different desktop computers: the Dell Studio XPS 435, Acer Aspire M5800-U5802A, and eMachines EZ1601-01 Nettop. Let's pick up where we left off, discussing the features of the Acer Aspire M5800-U5802A desktop.


Acer Aspire M5800-U5802A desktop

If the manufacturer's configuration does not suit your needs, you are a bit limited as to the improvements you can make on your own to the Acer Aspire M5800-U5802A. There are two vacant hard drive bays you can use, as well as one 1x PCI Express mini slot if you want to upgrade your graphics card.

As stated before, the 8GB of RAM is the maximum you can get on the Aspire M5800-U5802A, as all four of its RAM slots are occupied. This limited upgrade capability might be a downer for some, but you have to take into consideration the low price you are paying for such a system, and beggars can't be choosers.

Unfortunately, the M5800-U5802A does not have a Blu-ray drive like some of its competitors. This can be fixed, though, by buying one separately and installing it yourself. Once that's done, you can connect the M5800-U5802A to an LCD or high-definition television via the HDMI port.

Besides HDMI, you also get a DVI and a VGA port. Additionally, you get FireWire, ten USB ports, headphone and microphone jacks, and more. There is no eSATA port available on the M5800-U5802A. But there is a card reader that supports several different types of memory cards.

Monday, October 26, 2009

Small Is The New Big

Why it’s a Breakthrough: It’s a fingernail-sized computer chip that can hold 1 terabyte - 1 trillion bytes of data - or 50 times the capacity of today’s best silicon-based chip technologies.



The Story: Engineers from North Carolina State University have unveiled what they call a nanostructured Ni-MgO system that can store far more data than any single chip before. Plus, with its minuscule size, the implications are impressive. As it is, the chip can hold up to 20 high-definition DVDs or 250 million pages of text, “far exceeding the storage capacities of today’s computer memory systems,” according to a press release from North Carolina State University.

Dr. Jagdish “Jay” Narayan, director of the National Science Foundation Center for Advanced Materials and Smart Structures at the university, spearheaded the effort behind this breakthrough, which relies on the process of selective doping, in which an impurity is added to a material to change its properties. The work begins at the nanoscale, where the engineers added metallic nickel to magnesium oxide, a ceramic. The addition contained clusters of nickel atoms no bigger than 10 square nanometers; for perspective, a pinhead has a diameter of about 1 million nanometers. The result was a 90 percent reduction in the size of the chip, yet with a vastly enhanced capacity to hold data.

“Instead of making a chip that stores 20 gigabytes, you have one that can handle one terabyte, or 50 times more data,” Dr. Narayan said.

The process also shows promise for boosting vehicles’ fuel economy and reducing heat produced by semiconductors, a potentially important development for more efficient energy production. By using the process of selective doping, the engineers could introduce metallic properties into ceramics, Narayan said. The process would allow them to develop a new generation of ceramic engines able to withstand twice the temperatures of normal engines. The engines could potentially achieve fuel economy of 80 miles per gallon, Narayan said.

And, since the thermal conductivity of the material would be improved, the technique could also have applications in harnessing alternative energy sources like solar energy.

The breakthrough using the process of selective doping also advances knowledge in the emerging field of “spintronics,” which is dedicated to harnessing energy produced by the spinning of electrons.

“Most energy used today is harnessed through the movement of current and is limited by the amount of heat that it produces, but the energy created by the spinning of electrons produces no heat,” the university stated in a press release.

The engineers manipulated the nanomaterial so the electrons’ spin within the material could be controlled, which could prove valuable to harnessing the electrons’ energy. The finding could be important for engineers working to produce more efficient semiconductors.

Sunday, October 25, 2009

Oct. 23, 1995: First Computer-Network Wiretap

1995: A federal judge for the first time authorizes a wiretap of a computer network. It leads to hacking charges against a young Argentine for breaking into sensitive U.S. government sites.

Attorney General Janet Reno worried in 1995 about hacking turning the internet into the "Wild West of the 21st Century."
Dennis Cook/AP Photo

Arrested and later extradited to the United States was Julio Cesar Ardita, who was 21 at the time. His online name was “griton” — Spanish for “screamer.” The hacks, using a dial-up modem, were traced to his parents’ Buenos Aires apartment, located near the university where Ardita was studying computer science.

U.S. authorities said he first accessed a system at Harvard University’s Faculty of Arts and Sciences. Using a sniffer, he obtained passwords as users accessed other systems.

He then used this information to breach those systems, and continued the process to obtain access to computers at the Defense Department, Caltech, Northeastern University, the University of Massachusetts, NASA’s Jet Propulsion Laboratory, NASA Ames Research Center, the Naval Research Laboratory and the Naval Command Control and Ocean Surveillance Center — as well as systems in Argentina, Brazil, Chile, Korea, Mexico and Taiwan.

Although he was accused of accessing sensitive information, he did not steal any data. His motive appeared to be a hacking addiction. At the time of his 1998 extradition, Janet Reno acknowledged that a failure to combat hacking could create online chaos.

“If we aren’t vigilant, cybercrime will turn the internet into the Wild West of the 21st Century,” she said. “The Justice Department is determined to pursue cybercriminals at home and abroad.”

After detecting intrusions at Harvard, the FBI and the Naval Criminal Investigative Service applied for a search warrant and began to monitor activity using a program called I-Watch, run on a government computer installed at Harvard. It searched relentlessly through the goings-on of approximately 16,000 legitimate users of Harvard’s network in its attempt to pinpoint the hacker.

Matt Parsons, an FBI agent at the time, said the government went to great lengths to protect the privacy of Harvard’s network users. He said that the government isolated certain words the authorities believed the hacker was using, but did not immediately read those communications connected to those phrases.

“If a telltale word or phrase was intercepted, the monitoring computer initially would display up to 80 characters surrounding the target word or phrase,” he said. “If it remained ambiguous after these 80 characters were examined whether what had been intercepted was the activity of the intruder or a legitimate user, investigators used a computer utility program to look for further indicia of the intruder before actually examining the computer session.”
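
The I-Watch program itself was never published, but the behavior Parsons describes - flag a target word, then show only a short window of surrounding text before any deeper look - is easy to illustrate. The Python sketch below is a hypothetical rendering of that idea, not the government's actual tool; the keyword list and the 80-character window are placeholders drawn from his description.

```python
# Hypothetical illustration of keyword-plus-context monitoring as described
# in the article; it is NOT the actual I-Watch program.

KEYWORDS = ["zap", "sniffer"]   # placeholder target words, not the real list
WINDOW = 80                     # characters of context shown around each hit


def context_hits(session_text, keywords=KEYWORDS, window=WINDOW):
    """Yield up to `window` characters surrounding each keyword match."""
    lowered = session_text.lower()
    for word in keywords:
        start = 0
        while True:
            idx = lowered.find(word.lower(), start)
            if idx == -1:
                break
            half = max(0, (window - len(word)) // 2)
            lo = max(0, idx - half)
            hi = min(len(session_text), idx + len(word) + half)
            yield word, session_text[lo:hi]
            start = idx + len(word)


if __name__ == "__main__":
    sample = "user ran: ftp to remote host ... put zap ... chmod +x zap"
    for word, snippet in context_hits(sample):
        print(f"[{word}] ...{snippet}...")
```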

After more than two years battling and then waiving extradition, Ardita pleaded guilty in May 1998 in a Boston federal court to two counts of unlawfully intercepting and damaging government files.

One count concerned intercepting communications on the government computer orac.wes.army.mil, and one for transmitting a program named “zap” to mindy.nosc.mil — another government computer — in an attempt to damage its log files.

Ardita was fined $5,000 and sentenced to three years’ probation. A 2001 thesis at Argentina’s National Technical University footnoted Ardita as director of a firm called Cybersec S.A. Security System.

Friday, October 23, 2009

Computing Your Computer Needs

(CBS) There are more choices than ever if you're in the market for a new computer. Should you get one of those small, lightweight netbooks, or lug around a heavier notebook?

CNET-TV senior editor Natali Del Conte discussed the pros and cons of netbooks and notebooks on "The Early Show", to help you decide which best fits your computing needs.

Netbooks are all the rage but a little underpowered, Del Conte said. Netbooks, she said, are good for e-mail and surfing the web. They are also good travel companions, offering long battery life and light-duty processing. Storage is minimal.

Notebooks are a little heavier, Del Conte said. Notebooks, she explained, have screens larger than 11 inches, the latest processors and bigger hard drives, but shorter battery life.

However, notebooks are now being produced thinner and lighter without losing their computing power. In addition, Del Conte said notebooks are coming down in price to compete.

Other computers include the desktop, which is a full-powered computer. Desktops, Del Conte said, usually have a lot of memory, disk space and speed. She said these computers are good for a family's computing needs, but are not portable. Some also have interactive touch screens.

Another computer making a splash in the market is the nettop, Del Conte said. The nettop, she said, has a touch screen monitor that's usually larger than a traditional computer screen. Del Conte said these computers are good for the kitchen or living room. She said they're good for browsing the web, viewing photos, and listening to music.

"It's more of an entertainment and browsing device," she said.

Notebooks:
• A working machine made for portability. 11 inches or bigger.
• Should have enough storage for your documents, music, photos, etc.
• Fast processors for multitasking. They are starting to come with Blu-ray players and HDMI ports for high-definition video watching.

Sony Vaio CW
• 14-inch widescreen display
• Blu-ray drive and HDMI out
• Up to 400 gigs of hard disk space
• 2.8 gigs of RAM
• Starts at $800 and goes up to $1,500

Netbook:
• A small and portable computer that is less than 11 inches.
• Not a lot of storage, not a fast processor. Made for surfing the Web on the go.
• Slightly sluggish. Not made to be a main PC.

Nokia Booklet 3G
• First computer from Nokia
• Built-in 3G through AT&T
• 12 hours of battery life
• $299 with a two-year contract at $60/month of 3G service; $599 without a contract

Premium Notebooks:
Notebooks are now becoming thinner and lighter without losing computing power. Prices are coming down as well, but high-end models are still expensive.

HP Envy 13
• 13-inch widescreen display
• Ultra premium category
• Solid aluminum and edge-to-edge glass
• Oversize touchpad with multi-touch controls
• Dual-core Intel ultra-low voltage processor
• Optional Blu-ray player
• Starts at $1700

Desktop:
A family PC that stays in one place. Not portable. We are seeing the newest models come out that are touch screen, but that is certainly not necessary. Big hard drives, lots of RAM, made for all the family's computing needs.

Acer All-in-One PC
• Multitouch screen
• 23-inch display
• Intel Pentium Dual Core processor
• 320 gigs of hard disk space
• 4 gigs of RAM
• Starts at $900

Nettop:
Family-style computers that are great for the kitchen or living room. Built-in cameras for Web chatting. Touch-navigation for Internet browsing, or playing media like movies, music, or photos.

HP Touchsmart 300
• CNET Editors Choice
• Multitouch screen
• Built-in wireless
• HP TouchSmart Suite of applications for advanced media applications like photo viewing, music playing, movie watching, etc.
• 20-inch widescreen
• Starts at $900

Wednesday, October 21, 2009

Six things you need to know about the BCS standings

For starters, Florida and Alabama are in the driver's seat in the race for the Jan. 7 BCS National Championship Game in Pasadena, Calif.

Defending BCS national champion Florida is No. 1 in the initial BCS standings, followed by No. 2 Alabama and No. 3 Texas.

In the first 11 years of the BCS, the top two teams in the initial standings have reached the title game once (Texas and Southern California in 2005). Nearly half of the 22 teams that debuted at No. 1 or 2 advanced to the BCS National Championship Game.

So as long as the Gators and Crimson Tide win the rest of their regular-season games, the Dec. 5 SEC championship game in Atlanta's Georgia Dome will be a play-in game for the BCS National Championship Game.

The bad news for those SEC teams? Ohio State and Oklahoma won't be their sacrificial lambs for the fourth straight season.

The Buckeyes, who lost badly to Florida and LSU in two of the past three BCS championship games, were all but eliminated from the BCS championship race with their stunning 26-18 loss at Purdue on Saturday. Oklahoma fell to 3-3 following its 16-13 loss to Texas in the Red River Rivalry game in Dallas.

Here are a half-dozen other things to know about the initial BCS standings:

Alabama
1. Could we have an Alabama-Florida rematch in Pasadena?

It seems unlikely, and Big Ten commissioner Jim Delany might threaten secession from the BCS if it happened. The last such scenario happened in 2006, when No. 1 Ohio State defeated No. 2 Michigan 42-39 in the teams' regular-season finale.

Two weeks later, after UCLA upset No. 2 USC 13-9 and No. 4 Florida beat Arkansas 38-28 in the SEC championship game, the Gators jumped to No. 2 in the final BCS standings. The one-loss Gators, instead of the one-loss Wolverines, got a shot at the undefeated Buckeyes. Florida blasted Ohio State 41-14 in the BCS title game.

The Crimson Tide have a sizable advantage (.0615 points) over No. 3 Texas in the initial BCS standings. The gap between Alabama and No. 4 Boise State is more than twice as large (.1443). So if the No. 3 Longhorns lose one of their final six regular-season games, an SEC rematch in Pasadena might not be completely out of the question.

Florida
2. Why is Florida ranked ahead of Alabama?

Using the eyeball test, Alabama looks like the country's most complete team right now. But Florida is ranked No. 1 in each of the three components of the BCS formula -- the coaches' poll, Harris Interactive Poll and average in six computer rankings (the Crimson Tide are No. 1 in The Associated Press Top 25 poll, which is no longer used in the BCS recipe).

The BCS computers love the Gators, who are No. 1 in four of the six computer polls. I'm not sure which season the computers have been watching. The Crimson Tide have already beaten two teams ranked in the BCS standings: No. 14 Virginia Tech and No. 24 South Carolina. Florida has beaten one ranked team: No. 9 LSU.
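
Since those computer averages come up twice in this article, here is a rough sketch of how the three components combine under the BCS method as it has been publicly described: the Harris and coaches' poll points are each taken as a share of the maximum possible, the best and worst of the six computer rankings are thrown out, and the three percentages are averaged. The numbers below are hypothetical, purely to show the arithmetic; the poll maximums in the comments are assumptions and varied from year to year.

```python
# Rough sketch of the BCS averaging, with hypothetical inputs.
# Assumed maximums: Harris poll 2850 points, coaches' poll 1475 points.

def computer_share(ranks):
    """Drop best and worst of six computer ranks; 25 pts for No. 1 down to 1 pt for No. 25."""
    points = sorted(max(0, 26 - r) for r in ranks)
    return sum(points[1:-1]) / 100.0          # middle four ranks, out of a possible 100

def bcs_average(harris_pts, harris_max, coaches_pts, coaches_max, computer_ranks):
    return (harris_pts / harris_max
            + coaches_pts / coaches_max
            + computer_share(computer_ranks)) / 3.0

# Hypothetical example: a team ranked No. 1 almost everywhere.
print(round(bcs_average(2810, 2850, 1460, 1475, [1, 1, 1, 1, 2, 3]), 4))
```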

USC
3. So how many teams are left in the BCS hunt?

Historical data suggests about a dozen teams are still in the race. Since the BCS began in 1998, no team ranked No. 13 or worse in the initial BCS standings has reached the national championship game. LSU debuted at No. 12 in 2003 -- the lowest initial ranking for an eventual BCS national champion -- before climbing its way to the title game.

That means Georgia Tech is the last team with any championship hopes. The No. 12 Yellow Jackets have already played the meat of their schedule, including a 28-23 upset of then-No. 4 Virginia Tech on Saturday night. Only two of Georgia Tech's remaining five regular-season opponents currently have a winning record, and both of those teams -- Wake Forest and Georgia -- will play at Bobby Dodd Stadium.

No. 7 USC is in the best position among the one-loss teams, but the Trojans are being heavily penalized for their 16-13 loss to Washington on Sept. 19. USC is No. 11 in computer average behind teams like Oregon and Georgia Tech, but with games left at the No. 11 Ducks and home against No. 22 Arizona, USC can greatly improve its computer profile.

Boise State
TCU
4. Does Boise State really have a chance to play for the BCS championship?

It would probably take a doomsday scenario for the teams from BCS conferences for it to happen. For Boise State to reach the BCS Championship Game, there could be only one unbeaten team from the ranks of Alabama, Florida, Texas, Cincinnati, Iowa and TCU. And even that scenario might not be enough for the Broncos to reach Pasadena.

Even if Boise State finished the regular season with a 12-0 record, it might be passed by one-loss teams such as USC, LSU and Miami.

The Broncos are No. 4 in the BCS standings, but it's hard to imagine them going any higher because of their future schedule strength (104th, according to the NCAA). They lost 17 points in the coaches' poll last week, even after winning at Tulsa 28-21.

The good news for Boise State? It has seven more chances to impress voters and the computers (the Broncos play a 13-game regular-season schedule because they play at Hawaii). The bad news? Its remaining seven opponents have a combined record of 19-25.

Boise State really needs No. 8 TCU to lose. The Horned Frogs would probably pass the Broncos in the BCS standings if they win at No. 16 BYU on Saturday and beat No. 18 Utah at home Nov. 14.

Only one team from a non-BCS league is guaranteed a spot in a BCS bowl game -- if it finishes in the top 12 of the final BCS standings, or in the top 16 and is ranked ahead of one of the six champions of the BCS leagues.

Cincinnati
Iowa
5. Does Cincinnati or Iowa have a better chance of reaching Pasadena?

Based on schedule strength, the Hawkeyes seem to be in a much better position than the Bearcats. Iowa has played the country's ninth-toughest schedule, according to the NCAA, and it's getting a lot of mileage out of its nonconference victory over No. 22 Arizona and road win at No. 13 Penn State.

Iowa plays only five more games, starting Saturday at improving Michigan State. The Hawkeyes' future schedule is ranked No. 36 by the NCAA and they could get a big bump by winning at No. 19 Ohio State on Nov. 14, too.

Cincinnati's schedule strength (tied-67th, according to the NCAA) is worse than Boise State's, which is surprising. The Bearcats have already won at Oregon State and South Florida. The Bearcats can make up ground by beating No. 23 West Virginia at home Nov. 13 and No. 20 Pittsburgh in their Dec. 5 regular-season finale. But Cincinnati's future schedule is ranked 80th-toughest in the country, according to the NCAA.

Texas
6. So how does Texas get jobbed this season?

As long as the Longhorns win their final six games, it's hard to imagine them being left out of the BCS Championship Game. The Longhorns don't have to worry about Oklahoma, so a three-way tie in the Big 12 South seems like a very remote possibility.

Texas still plays two ranked opponents: at No. 15 Oklahoma State on Oct. 31 and No. 25 Kansas at home Nov. 21.

"We were sitting here in this same position last year and after we lost to Texas Tech we let it go back into a system and the system kept us out," Texas coach Mack Brown told reporters after beating Oklahoma. "Now next week's game becomes bigger than this one and it'll progress like that for the rest of the year."

Tuesday, October 20, 2009

Many ways to preserve a computer password

Forgot your password?

How often have you admitted that with a click, then had to wait for a rescue e-mail? And maybe just to get that e-mail on its way, you first had to answer questions about your dog, your mother or your favorite brand of toothpaste.

Sometimes even that doesn't help. It may take a professional to get a person out of password purgatory.

About half a dozen times a month, someone hauls in a computer that won't budge without its codeword, said Nia Joseph, manager of Computer Outlets on Colonial Boulevard in Fort Myers. After her staff makes sure it's not because the computer was stolen - and that's why the password is unknown - they use a special program to unlock the machine and reset the password.

Breaking into your own computer could cost you $50 at Computer Outlets or another repair shop.

A few weeks ago, I wrote a column for the newspaper and moaned about my lack of password proficiency. I asked readers for suggestions.

Two dozen readers responded. Here are some of their ideas:

- Dolph Secula suggested changing passwords once a month: choose one basic word for even-numbered months and another for odd-numbered months, adding a numerical character to each. That way you can guess your password in one or two tries, even if you forget. (A quick sketch of this rotation appears after the list.)

- Jim Scollen suggested a password-hiding grid at vvsss.com/grid/.

- Bruce Johnson wrote about several online helpers, including LastPass, which he likes best. But there are also PassPack and Roboform (at roboform.com/), which stores and installs up to 10 passwords.

Johnson also suggests a handheld password organizer from Atek called the Logio Secure Password Organizer, which stores more than 200 records and costs $29.95. Go to atek.com/logio-secure-password-organizer.html.

- Joan Cross uses a similar program for Macintosh computers, at agilewebsolutions.com.

- Dianne Rushton, George Shelley and Jo-Z Honeycutt wrote with low-tech approaches. Designate one little address book or notebook for password storage and use a page for each entry.

- One card for each in a recipe file box works best for Ellie Vetter; Marie Jones of Cape Coral uses one Rolodex card for each.

- Cheryl Payne of Fort Myers swears by a simple Excel spreadsheet. Use separate columns for company name/Web address; user name/e-mail address; password; answer to security question.

- Joyce Bradley of Alva sensed that perhaps the problem sometimes runs deeper than passwords.

"I am no psychiatrist or even a professional counselor," Bradley wrote, "but I would first say to you to simplify your life. Just like cleaning out your closet, you should get rid of the extra accounts and keep what you use on a regular basis. By all means, organize!"

Saturday, October 17, 2009

Netbook Sales Outpacing Sales of Traditional Notebook Computers


Netbook sales are taking off, according to a new report from DisplaySearch. The report indicates that netbook revenues were up 37 percent over the same quarter last year and up 264 percent for the year. Revenues from larger notebook PCs were down. But portable notebooks with screens 13" to 16" showed growth as well.

The mini-note category, which includes netbooks, now represents 22.2 percent of the portable PC shipments and 11.7 percent of revenues. The growth in netbooks, said DisplaySearch, is owing to the low prices, which make them especially suitable as a second PC and attractive to first-time buyers.

Telecom providers are jumping in too, hoping to boost revenues by offering subsidized mini-notes when the customer signs a two-year data plan contract. In North America, the devices are also being offered as enhancements by cable TV providers to entice customers to sign up for a package of their services.

"Mini-notes have been a significant contributor to volume growth in the portable PC market as their very attractive price points make owning a secondary computer viable for many consumers. However, the lower ASPs of these devices are clearly having a negative impact on portable PC market revenue," said John F. Jacobs, Director of Notebook Market Research. "For 2009, we expect continued ASP erosion across all portable computer categories, leading to the first Y/Y decline of portable computer revenue."

DisplaySearch said it expects these revenue and shipment trends to continue into 2010, with mini-notes accounting for 21.5 percent of shipment volume but just 10.9 percent of total revenue for the portable computer market in 2010.

About the Author

Denise Harrison is a freelance writer and editor with 20 years of experience. She specializes in technology, specifically in audiovisual and presentation. She works as a consultant for Second Life projects and is involved with nonprofits and education within the 3D realm.

Thursday, October 15, 2009

Eager to upgrade to Windows 7? Be sure your PC can take it

You're tempted by Windows 7, the superior operating system Microsoft delivers next week. But you're not sure if your computer can run the latest software, or if you can avoid the palpitations that typically accompany the upgrade ordeal.

It depends on where you're coming from. If you're running Windows Vista, Microsoft's current operating system, the move to Windows 7 should be relatively stress-free. If you're traveling from the older XP, brace yourself for a more exasperating time.

Not all of you should upgrade an existing PC to Windows 7, even if you pine for a better computing experience. (Windows 7 is far friendlier than the Windows Vista operating system it replaces on Oct. 22 — my full review is coming shortly.) There are risks, however small, associated with surgery, and migrating from one computer operating system to another qualifies as a big-time operation.

If your computer is getting close to retirement anyway, take the plunge and buy a new PC if you can afford one. A new machine in which Windows 7 comes installed arrives with far fewer hassles, naturally. In this troubled economy, there are almost certainly deals to be had. Versions of Windows 7 even run on sub-$500 computers.

If you're inclined to upgrade, the procedure doesn't have to be painful. My own experience has been smoother than any Windows upgrade I've performed in the past. It costs $120 to upgrade to the Home Premium version of the software, sure to be the most popular. What you need to know:

•Does my computer have what it takes? Most PCs bought in the last few years should handle Windows 7 just fine. But keep in mind you'll need at least 1 or 2 gigabytes of RAM and 16 to 20 GB of free disk space (more is better, of course). The sums depend on whether you have an older 32-bit type computer or newer 64-bit system. In simple terms, 64-bit machines can digest more data in any one chunk. By contrast, 32-bit systems can't take advantage of more than 3 GB of memory no matter how much RAM is installed.

Meeting minimum storage requirements might be an even bigger obstacle on some systems, especially if you had a cramped drive to begin with.
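
If you want a rough sense of whether your machine clears those bars before buying the upgrade, the figures above (1 to 2 GB of RAM and 16 to 20 GB of free disk space, depending on 32-bit versus 64-bit) are easy to query. The Python sketch below is a Windows-only, unofficial check of my own devising, with the system drive assumed to be C:; Microsoft's Upgrade Advisor, discussed below, remains the supported way to do this.

```python
# Rough, unofficial Windows-only readiness check against the figures in the article:
# 1 GB RAM / 16 GB free disk for 32-bit, 2 GB RAM / 20 GB free disk for 64-bit.
import ctypes
import platform
import shutil

class MEMORYSTATUSEX(ctypes.Structure):
    _fields_ = [("dwLength", ctypes.c_ulong), ("dwMemoryLoad", ctypes.c_ulong),
                ("ullTotalPhys", ctypes.c_ulonglong), ("ullAvailPhys", ctypes.c_ulonglong),
                ("ullTotalPageFile", ctypes.c_ulonglong), ("ullAvailPageFile", ctypes.c_ulonglong),
                ("ullTotalVirtual", ctypes.c_ulonglong), ("ullAvailVirtual", ctypes.c_ulonglong),
                ("ullAvailExtendedVirtual", ctypes.c_ulonglong)]

def check_windows7_readiness(system_drive="C:\\"):
    # Treat any 64-bit machine string (e.g. "AMD64") as a 64-bit system.
    is_64bit = platform.machine().endswith("64")
    need_ram_gb, need_disk_gb = (2, 20) if is_64bit else (1, 16)

    status = MEMORYSTATUSEX()
    status.dwLength = ctypes.sizeof(MEMORYSTATUSEX)
    ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(status))
    ram_gb = status.ullTotalPhys / (1024 ** 3)
    free_gb = shutil.disk_usage(system_drive).free / (1024 ** 3)

    print(f"{'64' if is_64bit else '32'}-bit system: "
          f"{ram_gb:.1f} GB RAM (need {need_ram_gb}), "
          f"{free_gb:.1f} GB free (need {need_disk_gb})")
    return ram_gb >= need_ram_gb and free_gb >= need_disk_gb

if __name__ == "__main__":
    print("Looks OK for Windows 7" if check_windows7_readiness() else "May fall short")
```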

•Upgrading from XP. There are way more Windows XP computers out there than there are Vista PCs. Upgrading from XP is far more tedious.

You must perform what Microsoft calls a "custom" or "clean" installation. Essentially, you're starting from scratch. A clean install involves wiping out programs, files and settings on the machine's drive, then restoring the whole enchilada (or at least the stuff you can't live without) afterwards.

Before proceeding, you'll want to copy files and settings onto an external drive or network. But your programs are another matter. Post surgery, you'll have to dig out original installation disks (if you can find them), and reload all the programs, including any software patches or updates that came later.

You can copy the files and settings (but not programs) that you previously had backed up using Microsoft's Easy Transfer wizard. Even Microsoft says some people might seek technical assistance.

Though I haven't tested it, Laplink's PCmover Windows 7 Upgrade Assistant software promises to help you upgrade from any version of Windows (from 2000 on), without copying files to another drive. Your hardware must still be up to snuff. Laplink's software costs $15 under a promotion that runs through Oct 22. Details are at Laplink.com.

•Upgrading from Vista. The move from Vista is a breeze by comparison, though you still may see a few bumps. As before, insert an installation CD, but this time you can choose an option in which programs, files and settings are retained. (You can still select the custom option if you're into housecleaning.) Upgrading in this fashion takes anywhere from 45 minutes to several hours, as was the case with me. When all is said and done, your data ought to be intact. They were in my tests.

•Will stuff work afterwards? Run the Windows 7 Upgrade Advisor software, free from Microsoft's website. It tells you if there are known incompatibilities with third-party programs and hardware. The Advisor recommended that I get the latest video driver from Nvidia for one of my PCs.

On a second machine, the Advisor program suggested that I remove, then (after Windows 7 is onboard) reinstall the Trend Micro security software on my PC. Since I didn't have an original disk, I called Trend Micro customer service. They told me that my existing software would not work with Windows 7, regardless of what the Advisor program said, and that I'd need a new version. Since I had months to go on my subscription, Trend Micro automatically upgraded me to the new program at no charge, and it's all gone smoothly ever since.

It may be time for Vista to rest in peace. That doesn't mean your computer has to call it a day.

This is the first quarter where PC shipments have increased. For all companies, total shipments grew 3.9 percent.

“These are good results especially given that PC shipments for the third quarter of 2009 are being compared to a very strong third quarter from 2008,” said Mikako Kitagawa, principal analyst at Gartner. “Sequentially, third quarter shipments grew 18 percent, which is higher than the historical seasonal growth from the second to third quarter.”

Tuesday, October 13, 2009

Computer Co-pilots 'Smarter' UAVs Could Prevent Human Error During Landings

The answers are said to be coming for all those U.S. Air Force UAV crashes: stouter, smarter airplanes that would be controlled by computers during takeoffs and, most importantly, landings, when the craft are hardest to control.

In the meantime, the Air Force is about to field a laser altimeter that could make its Predators and Reapers easier to fly until that automated takeoff-and-landing system is ready for the Reapers in 2012. The Reapers are scheduled to take over completely for the Predators in 2016.

For now, personnel inside launch and recovery stations overseas continue to guide the Predators and Reapers in with joysticks at the end of each mission. These operators are given control of the aircraft after pilots and sensor operators at bases in the United States are done taking videos or striking targets with Hellfire missiles. Local teams must land the planes because the signal delay from the United States would make them all but impossible to control.

A total of 65 Predators have crashed, including three so far this year. Thirty-six of the crashes were attributed to human error, and about half of those occurred during landing, according to Air Force records.

The issue of human error came to the fore in 2008, when then-Lt. Col. Robert Herz, a researcher at the Air Force Research Laboratory, reported that 71 percent of Predator crashes between 2003 and 2006 resulted from human error.

Despite the crashes, yesterday isn't fast enough for those who want a cloud of video-equipped UAVs over Afghanistan. The Predators and Reapers are such tricky aircraft to fly that the Air Force counsels patience as delicately as it can when Defense Secretary Robert Gates demands more and more planes in the air to support ground troops.

"It's been difficult to keep up with the demand and to do it in a smart way that meets both the war-fighters' needs and those of our crews," said Maj. Kathryn Nelson, who is helping to plot the future of the UAVs at Air Combat Command in Virginia.

Nelson, a former B-1 bomber pilot, has flown Predators and Reapers over Afghanistan for the better part of five years. So, too, has Maj. Matt Martin, a Predator pilot who oversees UAV training and operations at Air Combat Command. He offers some different arithmetic from that of Herz.

"The [Herz] study didn't capture all of the most recent and relevant data," Martin said. "Right now, our attrition rate - which is different from our mishap rate - is 5.3 attritions per 100,000 hours."

Martin also said the earlier data do not take into account the increased demands on the UAVs, which had them flying more than 100,000 hours last year.

All Predator crashes are "mishaps," but not all mishaps result in "attrition," or the loss of one of the $4 million aircraft.

"About half of those attritions are the result of some kind of mechanical failure, and half is a mixture of human factors and combat losses, or decisions to take a higher risk," Martin said. Of the human-error losses per 100,000 hours, "about two of those are a result of crew error, and one is a landing mishap."

Because of the delay in signal transmission over 7,000-plus miles from the U.S. bases, forward-deployed pilots handle the takeoffs and landings through a C-band line-of-sight data link.

It's not the same thing as being in the cockpit.

"It's a very challenging airplane to land," said Martin, a former RC-135 pilot who has commanded a forward-deployed takeoff-and-landing unit in Iraq and continues to fly one weekend a month at Creech Air Force Base, Nev., to stay current in the Predator.

Much of that difficulty in taking off and landing is visceral.

Martin cited "the combination of not being aboard the airplanes so you can't hear the engines spool up, you don't feel the ground rush, combined with you having no peripheral vision because you're looking through a nose camera, and you have to do a purely visual interpretation of your instruments."

Predators flew 138,404 combat hours in 2008, up 94 percent from a year earlier. Reapers flew 12,770 combat hours, and those figures are expected to double this year.

Those figures are driven by Gates' demand for the Air Force to establish 50 combat air patrols (CAPs), the term for 24-hour-a-day coverage zones, by 2011. The service is providing 35 CAPs now, 31 with Predators and four with Reapers, and the increase from 35 to 50 is expected to be all Reapers. The last Predator is scheduled to be delivered in 2011.

An automated takeoff-and-landing system is due for the Reaper in 2012. In the meantime, tests on the laser altimeter were completed in early July, and modification kits were due out in August to offer pilots a better sense of the runway.

A pitch-indication system is being tested and is not yet mature enough for use. But it likely will be speeded up so pilots will know that the nose of the plane is up during landing. The Air Force says that touching down nose-wheel first is the primary cause of human-faulted crashes during landing.

But even the auto-takeoff-and-landing system won't completely remedy the issue, because it will be limited by crosswinds. Anything above 10 knots will be handled by a pilot.

And what of the Predator? The demands of an auto system are too heavy for the lightweight craft, the Air Force says.

"The Reaper is a bigger, faster aircraft, 10,000 pounds [compared with] 2,300 pounds," Nelson said. "We don't think combatant commanders would want to take off fuel [from the Predator], which takes off persistence, or any of the other payloads that we could bring, be it weapons or sensors."

Instead, the laser altimeter tested on the Reaper is being considered for the Predator. It would offer input that would keep the pilot from putting the airplane into a negative pitch.

"That's autopilot logic software that's available by adding only the laser altimeter as a physical modification," Nelson said. "It would allow the pilot not to get himself into trouble, providing an 'assistant.' The pilot would still be doing the landing, but he would be landing a smarter airplane."

The question remains: Why has this taken so long? "The Air Force just told us they weren't interested" in buying an auto-takeoff-and-landing system, said Tom Cassidy, president of General Atomics Aircraft Systems Group. He said the system was not too heavy for the Predator.

Regardless, the Air Force admits it's still trying to catch up to technology that's 15 years old, but which has largely been adapted in a hurry-up fashion. In a radical departure from most Air Force planes, which are developed and tested over decades, the Predator was in the service's arsenal soon after the contract with General Atomics Aeronautical Systems was signed.

That contract followed a General Atomics sales pitch that foresaw the demand for the intelligence, surveillance and reconnaissance offered by the UAV, with the potential for weaponry. Since the UAVs were introduced into combat, ground commanders continually have demanded more intelligence coverage. ■

Monday, October 12, 2009

Sony VAIO VGN-FW41E/H


Not everyone needs a laptop to lug around all day — some people just want something smaller than a desktop PC that they can move around at home. This is the realm of the desktop replacement PC and if you’re happy to spend a few quid, you can pick up a portable that packs all of the power of a mid-range desktop — and the Sony VAIO VGN-FW41E/H is a prime example.

Like all desktop replacements, the Sony VAIO VGN-FW41E/H is big, but it’s also much better looking than many of the mega-screen monsters that qualify for this laptop category. The 16.4in screen doesn’t make much difference to the overall size — the VAIO VGN-FW41E/H still measures 384mm across — but at 37mm, it’s pretty slim for a laptop this size and Sony has done a great job with the design. This isn’t the lightest of laptops, but the 3.3kg shouldn’t prove too much of a problem if you’re just looking for a laptop to lug from room to room.

Read the rest of the review after the cut.

Although it’s all-plastic, the Sony VAIO VGN-FW41E/H’s silver and black feels solid, and its clean, understated appearance also helps make it look less imposing than other models of a similar size. There are a few flourishes — the hinge-ends are the usual circular VAIO shape and house the green-glowing power button and mains input socket.

Lift the lid and the benefits of a buying a desktop replacement become apparent — that big screen and a full-size keyboard. The screen has a glossy finish and while this leads to inevitable reflections with dark images, this is less of an issue for a laptop like this and the resulting improved contrast is worth it. The 1600 x 900 resolution isn’t quite high enough for full 1080p HD video playback, but it’s obviously sufficient for enjoying 720p playback from the internal Blu-ray drive. Although they’re fine for applications and YouTube videos, the stereo speakers that sit at the back of the keyboard aren’t quite up to blasting out movie soundtracks, though.

The keyboard has the now-standard Sony low-profile ‘Scrabble tile’ design and, as long as you like these kind of keys, it works pretty well. Top row of function keys aside, the keys are all full-size, but there’s no separate numeric keypad — not a problem for many people, but still a little unusual for a laptop this size. The trackpad is reasonably large and works well, too — its matte finish doesn’t have the same sticky-finger problems as some other laptops we’ve reviewed recently…

Performance isn’t main reason for buying a desktop replacement laptop, but it’s worth considering if you want to run the same broad range of software that most mid-range desktop set-ups can cope with. You certainly shouldn’t have any problems running any applications you choose with the Sony VAIO VGN-FW41E/H’s 2GHz Core 2 Duo T6400 processor and 4Gb of RAM, and the 500Gb hard disk provides plenty of space for programs, too.

Gaming is often a trickier proposition for any laptop and the ATI Mobility Radeon HD4650 chipset inside the Sony VAIO VGN-FW41E/H doesn’t have enough power to play the latest 3D titles at the screen’s native resolution — or at least not if you want more than a few frames per second. Drop the resolution and detail settings and you should be able to get playable performance from most titles, but this does make the VAIO VGN-FW41E/H more suited to casual gamers than hard-core players.

Battery life is seldom worth worrying about for a laptop this size, not least because you’d be foolish to take something so large out for use on the road. Some life away from the mains can be useful, though — like when you want to catch up on email in the garden without running an extension lead through the back door. In fact the VAIO VGN-FW41E/H doesn’t fare too badly in this regard and if its heavy and light use battery scores are anything to go by, you should be able to get about two and a half hours of unplugged use.
Sony VAIO VGN-FW41E/H

Price
£849 inc VAT
Rating
5 out of 6
Good
Good all-round spec; great keyboard, good-looking
Bad
Lacks a bit of 3D graphics grunt
Verdict
Good-looking and a great performer, there’s a lot to like about the VAIO VGN-FW41E/H and even Sony’s price isn’t too outrageous for the specification on offer.
Manufacturer
Sony
Buy from
Sony


Specifications

Processor
Intel Core 2 Duo T6400 (2GHz)
Memory
4Gb (8Gb max)
Graphics
ATI Mobility Radeon HD4650 (512Mb)
Hard disk
500Gb SATA
Optical drive
Blu-ray combo
Floppy drive
NA
Screen
16.4in (1600 x 900)
Connectivity
802.11n, Bluetooth 2.1, VGA, HDMI, FireWire, USB x3, ExpressCard/34, modem, Memory Stick, SD Card, mic, headphones
Other
Web cam
Operating system
Windows Vista Home Premium
Size
261 x 384 x 37mm
Weight
3,3kg
Battery life
Battery Eater Classic: 1h 47m
Battery Eater Reader: 3h 42m

Sunday, October 11, 2009

Polar Bears and Penguins

I have found an interesting comparison between computer Operating Systems and bee hives.

Most of it has to do with the people using them.

You see, when people want an Operating system, they have options, based on what criteria they have in mind for their usage.

For example, let's take the "average Joe" that everyone talks about. While the "average Joe" is really anything but average, what he/she wants is to get online, play multimedia, email, maybe hit some games, online and installed, tweet, print pictures from a digital camera, everyday kind of stuff.

Now, is "Joe" a flashy, got to look good while I work kind of guy, or is Joe a "so long as it gets done, I don't care how it looks" kind of guy. Does it have to have all the bells and whistles? Must it be something he can show off to his buddies and brag about how expensive and exclusive it is, thereby proving his level if sophistication and "I'm better than you" mentality?

There are a variety of OS's that will provide just what each type of user is looking for. Windows is kind of the mid-range, get it because it's whats there, don't have to think about it consumer.

There is Apple for the Cadillac through Mercedes crowd who simply must let everyone know they have the money to have less viruses as well as run Windows apps.

Then there is Linux, mostly for the Do It Yourself-er, "I don't need to spend all that money just to do ...", kind of folks.

In beehives, you have same types of consumers. You have the " I just want what everyone else gets" and call it a day buyer. they just want some hives and don't feel like or know how to build it themselves.

Then you have the "It simply must fit in with my backyard decor" crowd who want to let everyone know they are into the latest "save the..." trend and can look good while doing it. They buy hives that will make average price hives seem cut rate by comparison and include all the decorations and accessories you could ever want.

Then you have the Do It Yourself-er who looks at the accessories and designs, thinking how nice they look but "OH MY GOD NO" when they see the cost and figure they can build the same doggone thing for about one fourth to one third the price and be done with it. It doesn't have to look like a work of art to do it's job.

I am one of those DIY guys. I don't need all the fancy decorations and cute accessories. I don't use them most of the time, if ever. As far as I am concerned, as long as it does what I expect it to do, I'm happy. For a fraction of the cost or maybe just the time invested instead of having it done for me.

Linux does what I need it to do. I don't play all the whiz bang games and I don't need the 3D fancy pants stuff. That just slows me down.

I build my own bee hives. For between 30 and 50 bucks, I can have a solid, sturdy weather proof hive that even has an observation window so I don't have to bother the gals inside as often. That's good enough for me and my bees. We aren't out to win any prizes or impress anyone. I am way to old and way too ugly to start caring about that now.

Windows and Mac and Linux all have a unique pace in the world where they work best. Same with all the hives. From the homemade Top Bar to the top dollar "Kerkhoff" derived hives, there is a home and a beekeeper for all of them.

This ridiculous arguing about one OS having to dominate all the others, only one of them being able to be 'the best way' or 'the right way' is missing out on the bigger world where not all people fit into the same user mold. Same as beekeepers, for every 5 beekeepers, you will hear 10 different ways of what is "the best" way to keep bees.

Truth is, "the best" way is the way that works for you, the way you need it to work.

Saturday, October 10, 2009

Brain-Computer Interface Lets You Communicate Your Thoughts Throught the Web (Without Blogging)

Researchers have long been developing brain-computer interfacing (BCI) systems to enhance the quality of life for paralyzed or disabled people, enabling them to control gadgets such as computers and wheelchairs using only their minds. But the devices haven’t allowed humans to communicate with each other without speaking—until now.

Christopher James of the University of Southampton’s Institute of Sound and Vibration Research has devised a way to achieve brain-to-brain communication using BCI technology—effectively allowing a person to send his or her thoughts/brainwaves through the Internet.

The process involves two people who are attached to an EEG amplifier, two computers, an Internet connection, and one LED lamp. In the test, the first subject was asked to transfer his thoughts through a computer. The thoughts were hardly personal —the subject was simply asked to move his arm, meaning he had to think “move my arm.” His thoughts were translated into computer language consisting of a series of binary digits, zeros and ones. For example, when he raised his right arm, the computer read a one, and when he raised his left arm, the computer read a zero.

When the computer on the receiving end picked up the signals sent via the ‘net, the second subject saw flashing LED lights. Through these light patterns, the thoughts of the first person were transferred. To be sure the second party understood what was happening, James used a computer to confirm that the second person’s brain activity was creating the same ones and zeros. The digits matched, showing that the communication of the thoughts were indeed properly conveyed.

Still curious about the experiment? Check out this video for more info.

Thursday, October 8, 2009

Computer company using, promoting Corning Inc.'s 'gorilla glass'

CORNING -- Corning Inc. and Motion Computing Inc. of Austin jointly announced Tuesday the Texas computer maker would use Corning's Gorilla specialty glass in two of its laptop notebook computers.

The Texas firm, which specializes in durable and lightweight computers, is telling consumers about the specialty glass as part of its marketing.

Corning's glass improves screen durability without adding weight to the device, both companies said.

Gorilla Glass is a thin-sheet glass designed to function as a cover glass for portable display device screens. Its composition allows a deeper layer of chemical strengthening than is possible with other strengthened glasses. The end result, said Mark Matthews, vice president of Corning's Technical Materials division is a display glass cover that is scratch resistant and nearly unbreakable.

When Corning Inc.'s high-strength Gorilla glass was developed for auto windshield glass in the 1960s, it wound up on the shelf because of concerns an unbreakable windshield would cause more head injuries that it would prevent.

But about two years ago, Corning started hearing from its customers about their need for a stronger glass.

Gorilla glass was dusted off, reintroduced to the marketplace and is now expected to bring in close to $100 million in sales this year, Matthews said.

Motion's two notebook PCs that use Corning's specialty glass were designed for mobile use in the health care, construction, field service or manufacturing industries.

Trailcon Leasing, an Ontario-based trailer rental company, ordered several more computers after learning about the availability of Corning's Gorilla glass, Motion's IT manager, Stuart Innes, said.

Matthews said 12 major brand names -- including Dell, Samsung, LG and Motorola -- manufacture about 30 devices between them that utilize Gorilla glass. But Motion Computing, Matthews said, is the only company that mentions Corning's patented trademark Gorilla glass in the marketing of its PC notebooks.

He also said Corning is seeking out industrial uses for its high-strength display glass.

"The glass isn't part of the display but it protects the display," said Matthews. "So banking machines, vending machines, GPS units, you can use the glass on any devices with screens you touch."

Computer company using, promoting Corning Inc.'s 'gorilla glass'

CORNING -- Corning Inc. and Motion Computing Inc. of Austin jointly announced Tuesday the Texas computer maker would use Corning's Gorilla specialty glass in two of its laptop notebook computers.

The Texas firm, which specializes in durable and lightweight computers, is telling consumers about the specialty glass as part of its marketing.

Corning's glass improves screen durability without adding weight to the device, both companies said.

Gorilla Glass is a thin-sheet glass designed to function as a cover glass for portable display device screens. Its composition allows a deeper layer of chemical strengthening than is possible with other strengthened glasses. The end result, said Mark Matthews, vice president of Corning's Technical Materials division is a display glass cover that is scratch resistant and nearly unbreakable.

When Corning Inc.'s high-strength Gorilla glass was developed for auto windshield glass in the 1960s, it wound up on the shelf because of concerns an unbreakable windshield would cause more head injuries that it would prevent.

But about two years ago, Corning started hearing from its customers about their need for a stronger glass.

Gorilla glass was dusted off, reintroduced to the marketplace and is now expected to bring in close to $100 million in sales this year, Matthews said.

Motion's two notebook PCs that use Corning's specialty glass were designed for mobile use in the health care, construction, field service or manufacturing industries.

Trailcon Leasing, an Ontario-based trailer rental company, ordered several more computers after learning about the availability of Corning's Gorilla glass, Motion's IT manager, Stuart Innes, said.

Matthews said 12 major brand names -- including Dell, Samsung, LG and Motorola -- manufacture about 30 devices between them that utilize Gorilla glass. But Motion Computing, Matthews said, is the only company that mentions Corning's patented trademark Gorilla glass in the marketing of its PC notebooks.

He also said Corning is seeking out industrial uses for its high-strength display glass.

"The glass isn't part of the display but it protects the display," said Matthews. "So banking machines, vending machines, GPS units, you can use the glass on any devices with screens you touch."

Tuesday, October 6, 2009

The Evolution Of Technology In Schools

Schools try to keep up with the current technology trends, especially in Silicon Valley, the home of technology innovation. You would think that schools in Silicon Valley would be the most up to date on technology?with the latest computers, projectors, drawing boards?but coming from a first hand perspective, as a student at a local school, it's the complete opposite. I go to a high school where there are no technology classes that even teach students the basics of web development, or video production, or anything of that matter.

Our school just upgraded our computer labs to brand new computers, Windows XP machines, that of course, block Facebook, YouTube, and all those other good "time wasting" sites. Just this year, all the teachers' computers got connected to projectors so that teachers can show presentations, documents, etc. Also this year, our school finally got WiFi, but it is password protected and not open to students.

The restrictions on the use of school computers and the internet, are in my opinion, extreme. Each night all student accessible computers are wiped completely, and restored with all the basic programs ¿ Mozilla Firefox, IE6, Microsoft Office 2003. I understand the need for schools to protect local machines from viruses and spyware, but I feel like school policy is too extreme when it comes to blocking YouTube, Facebook, and other sites. These sites can be "time wasting" sites, but there are occasions when the sites are useful. I was the Technology Editor for my school newspaper last year, where we needed to get pictures and information from fellow students. We used Facebook chat and messages to communicate with other students to get information, to co-ordinate and to find things such as video from events.
ad_icon

In a neighboring high school, they have a full video production studio for daily video announcements ¿ yet at most other schools, such as my own, we are stuck with old PA systems for announcements and old technology along with a restricted web experience. This is not what the rest of the world would imagine for a school in Silicon Valley. A friend's school in Los Angeles has a full Mac computer lab for video and graphic work. My school ? one Mac, and it's not even allowed to be used by students. Given that in most of the developed world most schools, especially public schools, lag with technology ¿ but it seems that even when there is a will and a budget to implement new technology the policies are still outdated.

A “Supreme” (and educational!) new computer game

An article in today’s Washington Post details Our Courts, the civic education project former Supreme Court Justice Sandra Day O’Connor has been working on since her retirement from the bench.

I visited the Our Courts Web site to play the Supreme Decision game this afternoon, and came away thoroughly impressed. This particular game (there are two on the site) takes players through a hypothetical legal case about a student’s right to wear a certain t-shirt in school by (1) showing them all sides of arguments, complete with animated lawyers for each side being questioned by animated Justices, and the Justices deliberating with one another in their private chambers; (2) asking them questions about those arguments to ensure they understand the legal arguments presented; and (3) allowing them to make their own decision and thus cast the deciding vote (of course!) in the case. The game is designed for 7th and 8th graders, but although it wasn’t challenging, it held my interest for the 10 minutes or so it took to complete it.

I suggest that parents introduce their kids to this informative and fun resource at home, or tell their kids’ teachers about it. This game certainly would have been helpful when I was teaching my American Government course last year!

Monday, October 5, 2009

Microsoft hopes Windows 7 makes you forget about Vista The new operating system is more consumer friendly and includes several upgrades, including a l

Microsoft Corp.'s new Windows 7 computer operating system hopes to pull off a major trick with memory.

Not computer memory, but ours.

It's supposed to make us forget Vista.

The Vista operating system, which Windows 7 will officially replace later this month, had a terrible reputation almost from the time it debuted in 2007.

Because of Vista's technical foibles, sluggish operation and inability to play nicely with some other programs, consumers and professionals shunned it in droves, refusing to update from Microsoft's old, reliable XP operating system.

Apple Inc. made fun of Vista in a set of hilarious TV commercials, and Microsoft struck back meekly with ads that proclaimed Vista wasn't as bad as you thought.

The Windows 7 upgrade, which will sell for $119 for the Home Premium consumer version, is a chance at redemption. But it's also a campaign to head off the first real competition Windows has ever had in the PC field.

Next year, Web giant Google Inc. will introduce its first operating system, Chrome OS. Because it will be a so-called cloud computing system -- with many of its operations living on the Internet -- it's already hyped to be extremely fast, with the ability to constantly evolve.

Like Windows, Chrome OS will work on PCs. But unlike Windows, it will be free.

At first, Chrome OS will be just for the small laptops known as netbooks. But if it is successful and is expanded to full-size laptops and desktop computers, it could be a formidable challenger.

Which is perhaps why, although there is nothing revolutionary about Windows 7, Microsoft has striven mightily -- and in some ways successfully -- to at least catch up with and foresee the competition when it comes to user friendliness.

A prime example is its computer search function, which is frustratingly slow on Vista and previous Microsoft operating systems.

The new search on Windows 7 is a tremendous improvement in that it can almost instantly find a word or phrase anywhere on the computer, whether in documents, e-mails or even the names of photos and songs.

But outside of Microsoft, that kind of search is nothing new. Google Desktop gave PC users the ability to do it starting in 2004. And Apple made lightning-fast search part of its operating system in 2005. Finally, and perhaps not coincidently, Windows has gotten up to speed.

Microsoft is also looking forward by updating its enhancements for tablet computers that use touch screens. Indeed, the company says that on the day Windows 7 officially debuts -- Oct. 22 -- several manufacturers with which it works will introduce PC-based tablet computers.

That's a proactive move to head off Apple, whose long-rumored tablet is reportedly in development for introduction early next year.

Microsoft has already made copies of Windows 7 available to the press, which is a sign of either hubris or confidence.

Luckily for the software giant, a preliminary look at the operating system hints mostly at the latter.

Windows 7 has a look that resembles Vista, which is to say ugly. But early testing by me and a couple of colleagues found the new operating system to be faster and less troublesome than its predecessor.

It's also not nearly as bulky. The standard version of Windows 7, unlike Vista, will run on netbook computers.

Sunday, October 4, 2009

ORNL to Use NVIDIA Fermi to Build Next Gen Super Computer

Fermi supercomptuer will be ten times more powerful than today's fastest supercomptuer

NVIDIA was at the forefront of the push to move high-performance computing from CPUs to GPUs in scientific and other areas of research. As it turns out, the GPU is a very effective tool for running calculations historically run on the CPU.

NVIDIA announced its new Fermi architecture at its GPU Technology Conference recently. The new architecture was designed from the ground up to enable a new level of supercomputing using GPUs rather than CPUs. At the conference, Oak Ridge National Laboratory (ORNL) associate lab director for Computing and Computational Sciences, Jeff Nichols, announced that ORNL would be building a next generation supercomputer using the Fermi architecture.

The new supercomputer is expected to be ten times faster than today's fastest supercomputer. Nichols said that Fermi would enable substantial scientific breakthroughs that would have been impossible without the technology.

Nichols said, "This would be the first co-processing architecture that Oak Ridge has deployed for open science, and we are extremely excited about the opportunities it creates to solve huge scientific challenges. With the help of NVIDIA technology, Oak Ridge proposes to create a computing platform that will deliver exascale computing within ten years."

ORNL also announced at the conference that it would create a Hybrid Multicore Consortium with the goal of working with developers of major scientific codes to prepare the applications for the next generation of supercomputers using GPUs.

“The first two generations of the CUDA GPU architecture enabled NVIDIA to make real in-roads into the scientific computing space, delivering dramatic performance increases across a broad spectrum of applications,” said Bill Dally, chief scientist at NVIDIA. “The ‘Fermi’ architecture is a true engine of science and with the support of national research facilities such as ORNL, the possibilities are endless.”

Friday, October 2, 2009

Pioneer BDR-205 Blu-ray Disc Computer Writer


Pioneer Launches First 12x Blu-ray Disc Writer

High-Speed 12x Write Capability Lets Professional Users and Enthusiasts Quickly Test, Author and Preserve High Definition Content

LONG BEACH, Calif.--(BUSINESS WIRE)--Pioneer Electronics (USA) Inc. today announces its new BDR-205 Blu-ray Disc® Computer Writer, the industry’s first and fastest model to feature up to 12x write speed for single and dual-layer Blu-ray Disc (BD) media*. Ideal for authoring providers and system builders, the Pioneer® BD/DVD/CD Writer provides accurate, rapid performance for demanding professional applications.

“The BDR-205 drive represents our ongoing efforts to combine Pioneer’s engineering expertise with advanced technologies, and our next generation Blu-ray Disc writer is a great example of our no-compromise approach to optical disc product development,” said Steve Cohn, director of optical disc sales for Pioneer Electronics (USA) Inc. “We are bringing to market the fastest Blu-ray Disc writer to date, and it is just one of the many ’firsts’ that have come to define Pioneer’s 30-year heritage in optical disc innovation and leadership.”

Dynamic Performance for an Array of Professional Users

When utilized with a properly configured PC, the drive’s Low Vibration Mechanism Design improves overall writing accuracy, especially for those preserving copious amounts of critical data. Designed for maximum flexibility, Pioneer’s writer provides significant solutions for multiple user groups, including:

* System builders can confidently recommend the sophisticated BDR-205 to their clients, noting the drive’s unique design, robust build quality and high grade parts
* With 50Gbytes** of storage space on a dual-layer Blu-ray Disc, professional users can utilize the BDR-205 to rapidly test high definition feature films during the authoring process, as well as to back up large volumes of data with ease
* Besides up to 12x write speeds for Blu-ray Disc media, Pioneer’s new computer drive also provides read and write speed performance up to 16x for DVD and 40x for CD media

The new BDR-205 Blu-ray Disc computer writer begins shipping October 2009. The retail version of this product, the BDR-2205, will be available Q1 2010 for $249 MSRP.

Pioneer has been an innovator of optical disc technology since it shipped its first LaserDisc products, the precursor to DVD, to the consumer market in 1980. Pioneer went on to introduce the first DVD writer for video authoring use in 1997, the first DVD recorder as a VCR replacement in 1999, the first DVD/CD writer for home computer users in 2001 and the first Blu-ray Disc writer in 2006. Pioneer Corporation is one of the original Blu-ray Disc Founders. More details can be located at www.pioneerelectronics.com.


BDR-205 Specifications Table

Disc Media Write Speed (max.) Read Speed (max.)
BD-R SL 12x1 8x
BD-R DL 12x2 8x
BD-RE 2x 8x
BD-RE DL 2x 6x
BD-ROM SL/DL - 8x
DVD-R/+R SL 16x 16x
DVD-R/+R DL 8x 12x
DVD-RW SL 6x 12x
DVD+RW 8x 12x
DVD-ROM SL - 16x
DVD-ROM DL - 12x
DVD-RAM 5x 5x
CD-R 40x 40x
CD-RW 24x 24x
CD-ROM - 40x
Dimensions (H x W x D) 1.67 x 5.83” x 7.09”
Weight (pounds) 1.8
Interface Serial-ATA (SATA)


* The drive has been tested to achieve 12x speeds with certain media brands. 12x writing on all media cannot be guaranteed.

** Gbytes refers to billion bytes

1 12x BD-R SL write speed achieved using Panasonic and Sony 6x BD-R media

2 12x BD-R DL write speed achieved using Panasonic BD-R DL media

PIONEER and the PIONEER logo are registered trademarks of the Pioneer Corporation.

BLU-RAY DISC is a registered trademark of Sony Corporation.