Micello is mapping the great indoors. Unlike the mapmakers of yore, who toiled for years perfecting maps of oceans, continents, and mountain ranges, today's mapmakers have found their outdoor frontiers taken away from them by satellite imaging. The only way to chart new territory, they've found, is to start making maps detailing the innards of manmade buildings and complexes.

And this is where Micello comes in. The company's goal is to become the Google Maps of indoor spaces, and its staff of six people is doggedly mapping large public indoor spaces in the United States such as shopping malls, airports and universities. The maps the company is developing even include a search engine, so you can type "coffee" into a box and have the map point out all the locations in your vicinity that sell coffee. This way, if you're stranded in an airport and craving a cup of coffee, or are at a university looking for a particular lecture hall, you'll be able to look up your location on Micello and find out where you need to walk.

In this Q&A with Micello founder and CEO Ankit Agarwal, we discuss his company's passion for mapping, the use of crowdsourcing to make maps and where he plans to take Micello in the future. You said today that you're making about 10 maps a day. How many people do you have working on these maps nationwide? We have six people total: three doing design work and three doing data collection using our tools. In all, it takes someone about four hours to get one map done, and each person does around three or four maps a day. We're primarily mapping the [San Francisco-Oakland] Bay Area to start with, and our initial focus has been on Bay Area colleges and shopping malls.

Where do you get your data for building these maps? We get the floor plan of a particular place, whether from someone going and taking a picture of it or from the building itself giving it to us. We then convert the floor plan to a geo-coded, dynamic, personalizable interactive map, so that when you go to a shopping mall, the floor plan on the Micello map will interact with you. You saw in our demonstration today that we typed 'shoes' into the search engine and it found all the stores in the mall that sold shoes. That's a somewhat basic version of the interactivity we'll be shooting for in the future. In the next generation of search we're planning on making it really smart, so it can get information on specific brands and models if you type them into the search engine.
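The kind of search Agarwal describes reduces to keyword matching over geo-coded points of interest, ranked by distance from the user. Below is a minimal sketch of that idea in Python; the data model is hypothetical and is not Micello's actual schema or API.

```python
from dataclasses import dataclass
from math import hypot

# Hypothetical point-of-interest model for a geo-coded floor plan.
@dataclass
class Place:
    name: str
    tags: set[str]   # e.g. {"coffee", "food"}
    x: float         # position on the floor plan
    y: float

def search(places: list[Place], keyword: str, user_x: float, user_y: float) -> list[Place]:
    """Return places matching the keyword, nearest to the user first."""
    hits = [p for p in places if keyword in p.tags or keyword in p.name.lower()]
    return sorted(hits, key=lambda p: hypot(p.x - user_x, p.y - user_y))

mall = [
    Place("Starbucks", {"coffee", "food"}, 12.0, 40.0),
    Place("Foot Locker", {"shoes"}, 55.0, 10.0),
    Place("Peet's Coffee", {"coffee"}, 80.0, 35.0),
]
print([p.name for p in search(mall, "coffee", 10.0, 38.0)])
```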

How does crowdsourcing play into your strategy of building these maps? For the time being we are designing the indoor maps ourselves. Crowdsourcing comes in for updating information on the maps we've built. People can submit content describing who happens to be coming to give a lecture at a particular hall on campus, for instance, or Macy's can let people know that they're having a two-hour sale some afternoon by flagging it on the map. So people can share stories about what's happening in different locations on the maps. Looking to the future, how do you plan to monetize this application?

We have a short-term and a long-term focus. In the short term we plan to monetize this application through premium content. So to use the Macy's example again: if you want to promote a two-hour sale, you could pay to have information on it pushed out to all Micello users in the area whose profiles show they'd be interested in shopping at Macy's. In the long term we see ourselves as becoming the go-to company for designing interactive indoor maps.

U.S. immigration officials are taking H-1B enforcement from the desk to the field with a plan to conduct 25,000 on-site inspections of companies hiring foreign workers over this fiscal year. The new federal fiscal year began Oct. 1. Tougher enforcement from U.S. Citizenship and Immigration Services comes in response to a study conducted by the agency last year that found fraud and other violations in one in five H-1B applications. The move marks a nearly five-fold increase in inspections over last fiscal year, when the agency conducted 5,191 site visits under a new site inspection program.

In a letter to U.S. Sen. Charles Grassley (R-Iowa), Alejandro Mayorkas, director of the Citizenship and Immigration Services, said the agency began a site visit and verification program in July to check on the validity of H-1B applications. The letter was released on Tuesday by Grassley. "[The inspection program determines] whether the location of employment actually exists and if a beneficiary is employed at the location specified, performing the duties as described, and paid the salary as identified in the petition," Mayorkas wrote. Mayorkas, a former federal prosecutor recently appointed by President Barack Obama, was sworn in in August and said that since then, "I have worked tirelessly to learn of the condition of our anti-fraud efforts and other critical programs in our agency." In September, Grassley, an ardent critic of the H-1B program, asked Mayorkas to outline the steps his agency was taking in regard to H-1B enforcement. Among the issues Grassley asked about was specific information on companies that hire H-1B workers for jobs that don't exist and that, instead, don't pay the workers until contract work is found.

As part of its enforcement effort, Mayorkas said, the Citizenship and Immigration Services has hired Dun & Bradstreet, which provides credit reports among other services, to act as "an independent information provider" and help verify information submitted by companies hiring H-1B workers. Grassley, a co-sponsor of legislation that would increase H-1B program enforcement, said in a statement released today: "If employers are hiring visa holders without actual jobs lined up, American workers are losing out. Simply having them attest that they are complying with the law isn't good enough." Employers, he argued, must be held accountable and should be required to submit contracts and itineraries to prove that a job exists. Immigration attorneys, meanwhile, have seen an increase in demands for documentation from the Citizenship and Immigration Services as part of the approval process.

The storage software market showed signs of rebounding in the second quarter, but is still falling short of the pace set last year. Worldwide, storage software vendors raked in $2.8 billion in revenue in the quarter, down nearly 10% vs. the second quarter of 2008, according to an IDC report issued last week. However, some positive signs emerged. Within the storage software market, revenue for replication products grew 5% compared with the first quarter of this year, and data protection and recovery revenue was 3% higher than in the first quarter.

Revenue for device management and archiving software has also grown slightly since the beginning of 2009. "The storage software market is slowly starting to recover with positive growth over the first quarter of 2009," IDC analyst Michael Margossian said in a press release. However, IDC cautioned that growth between the first and second quarters is typical, so the year-over-year comparisons are more significant. EMC led the storage software market with 22.4% of revenue in the second quarter, ahead of Symantec (18.5%), IBM (11.5%) and NetApp (8.5%). While last week's report covers storage software, IDC this month also reported that storage hardware sales continue to struggle. Globally, revenue for external disk storage systems was $4.1 billion in the second quarter, an 18% decline year-over-year, and the network disk storage market declined 15% year-over-year. EMC also leads the external disk storage systems market with 21.5% of worldwide revenue.

This was the third straight year enterprise storage systems revenue declined in the second quarter.

Data Robotics today released its first iSCSI SAN storage array that, like its other low-end arrays, manages itself and allows any capacity or brand of disk drive to be mixed, matched and exchanged without any downtime. The DroboElite offers automated capacity expansion and one-click single- or dual-drive (RAID 5 or 6) redundancy for Windows, Mac and Linux machines. The new system extends the number of Smart Volumes - Data Robotics' thin provisioning that pools capacity from all eight drives - so users can now create as many as 255 virtual storage volumes, up from 16 in the current Drobo model.

The latest addition to the Drobo family of arrays is aimed at the small to mid-size business market and at resellers selling into the virtual server space, according to Jim Sherhart, senior director of marketing for Data Robotics. "Virtual servers tend to use a lot of small LUNs (logical unit numbers)," Sherhart said. The DroboElite supports VMware environments and advanced functionality including VMotion, Storage VMotion, snapshots, and high availability. It is also able to drop from higher to lower levels of RAID with no manual intervention. For example, if a user were to initially set up the DroboElite for dual-drive failure, he could switch to single-drive failure with one mouse click. Users can also change out drives, adding higher-capacity models, in 10 seconds - with no formatting required, according to Sherhart.

Tarun Chachra, chief technology officer at marketing company KSL Media, has owned two Drobo USB arrays for about a year and a half. He purchased four DroboPro arrays in June for use in two offices for Microsoft Exchange replication and backups for about 16 servers. He's also beta testing the DroboElite, which he plans to purchase for backing up his VMware servers because of its higher throughput with dual Gigabit Ethernet ports and greater number of creatable volumes. Chachra said he was impressed that he could simply go out and buy a 1TB, 7,200-RPM SATA drive for $69 and stick it in the DroboElite, saving him money on total cost of ownership versus pricier SAS drives. Chachra has been comparing his existing DroboPros, which can be configured with up to eight 2TB drives, to what he'd previously been using for backups: a Hewlett-Packard AiO400R array with four 500GB drives. The HP array runs the same iSCSI stack as the DroboPro, but it uses Windows 2003 Storage Server as a backup and replication application. Chachra said the DroboPro cost about $3,500, compared with $5,219 for the AiO400. The HP array was set up for RAID 5 right out of the box and couldn't be changed; the DroboPro offers both RAID 5 and 6 interchangeably.

The HP has forced Chachra to reboot his backup server every three days or so because it would hang and couldn't handle the bandwidth, he said. "We don't have huge IT teams looking at servers, so it's better for us to have something that can tolerate higher drive failure rates," he said. "We also don't stock a lot of hard drives. The main thing, though, is redundancy and having Exchange available all the time." "I don't know that an enterprise is going to run out and deploy this for 2,000 or 3,000 [users], but for small or mid-size shops, this is cost effective and it works as well as it should," Chachra added. The DroboElite also offers a non-automated thin provisioning feature called Smart Volumes that allows users to create new volumes in seconds and manage them over time by pulling storage from a common pool rather than from a specific physical drive allocation. Smart Volumes are also file-system aware, which allows deleted data blocks to be immediately returned to the pool for future use. Geoff Barrall, CEO and founder of Data Robotics, said the DroboElite can deliver cost savings of up to 90% compared to other iSCSI SANs "by combining cost-effective hardware with robust iSCSI features." The DroboElite is available now starting at $3,499, with configurations selling for up to $5,899 for 16TB (using eight 2TB drives).
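Smart Volumes, as described above, amounts to thin provisioning with file-system-aware reclaim: volumes draw blocks from one shared pool only when data is written, and deleted blocks go straight back to the pool. A minimal Python sketch of that behavior follows; it is illustrative only, not Data Robotics' implementation.

```python
# Many thin volumes share one block pool; capacity is claimed on write
# and returned on delete (file-system-aware reclaim).
class BlockPool:
    def __init__(self, total_blocks: int):
        self.free = total_blocks

    def allocate(self, n: int) -> int:
        n = min(n, self.free)
        self.free -= n
        return n

    def release(self, n: int) -> None:
        self.free += n

class ThinVolume:
    def __init__(self, pool: BlockPool, name: str):
        self.pool, self.name, self.used = pool, name, 0

    def write(self, blocks: int) -> None:
        self.used += self.pool.allocate(blocks)   # claimed only when written

    def delete(self, blocks: int) -> None:
        freed = min(blocks, self.used)
        self.used -= freed
        self.pool.release(freed)                  # blocks go straight back to the pool

pool = BlockPool(total_blocks=1000)
volumes = [ThinVolume(pool, f"vol{i}") for i in range(3)]   # up to 255 on the DroboElite
volumes[0].write(200); volumes[1].write(300); volumes[0].delete(50)
print(pool.free)   # 550 blocks remain in the shared pool
```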

Extreme Networks this week unveiled a blueprint for migrating data centers from the physical world to virtualization, and then ultimately to cloud computing. At the Gartner Data Center Conference in Las Vegas, Extreme disclosed its data center evolution strategy, which is in sharp contrast to that endorsed by Cisco. Extreme's approach seeks to eliminate virtual switching at the server level, while Cisco proposes adding that element to data center servers, specifically the blade servers within its new Unified Computing System platform.

Extreme's blueprint is expected to collide with Cisco's Data Center 3.0 strategy, Force10 Networks' Virtualization Framework, elements of Juniper's Stratus project and architectures pitched by Brocade, HP and other data center switching competitors. Extreme's plan is built on four "pillars" of data center infrastructure and operations: physical, efficiency, scalability and automation/customization. It's intended to assist in evolving data centers to wide-scale virtualization and cloud computing "without forcing certain technologies or operating methodologies" on users, says Gordon Stitt, Extreme chairman and co-founder. Physical deals with network topology, switching tiers, bandwidth and performance; efficiency involves integration, support and management of virtual machines and hypervisors; scalability revolves around switching capacity to support thousands of VMs and applications; and automation/customization supports configuration and extensibility through XML and APIs. Extreme is addressing the first pillar through its existing stackable and modular switching lines - the Summit x450 and x650 switches for top-of-rack server access applications and the modular BlackDiamond 8800 series for the core. Stacking allows switch capacity and density to increase while also enabling the stacked configuration to be managed as a single switch, with aggregated links providing increased bandwidth and redundancy. Extreme switches are also 40/100G Ethernet "ready" through the inclusion of an expansion slot for uplinks, Stitt says.

He expects Extreme to have 40/100G modules for the switches around mid-2010. For efficiency, Extreme proposes that switches become "VM aware." This will allow the company's switches to dynamically track and manage VMs and apply policies as VMs move across the network. Extreme currently supports VMware's hypervisors, with plans to add others to the mix. The differentiator, though, comes in the scalability pillar. Today, vendors such as Cisco and even the leading blade server companies - IBM, HP and Dell - propose adding a software-based virtual switch to the server itself to handle the growing number of VMs. Cisco's instantiation of this is VN-Link and the Nexus 1000V software-based switch. But virtual soft switches on servers add another element and layer of management complexity to the virtual data center, Extreme asserts. Moving - or keeping - switching in the network, meanwhile, reduces management complexity while increasing switching performance, the company says.

So Extreme proposes keeping switching in the switch and not moving it to the server. "Hardware switching is very predictable" in performance and behavior, Stitt says. "Software or server-based switches can be unpredictable." The company is a proponent of an emerging IEEE specification called Virtual Ethernet Port Aggregation (VEPA), which, among other attributes, is designed to eliminate the large number of network and virtual switching elements that need to be managed in a data center. VEPA does this by allowing a physical end station to collaborate with an external switch to provide bridging support between multiple virtual end stations and external networks, according to a presentation on the IEEE Web site. Work on standardizing VEPA began last month, Stitt says, adding that Extreme plans to be one of the first vendors to adopt it once it is ratified. Extreme still needs to flesh out its data center strategy with a plan around network and storage convergence, however.

Stitt says that plan will be forthcoming and will include Extreme's views and plans around technologies and standards like Converged Enhanced Ethernet and FibreChannel-over-Ethernet (FCoE). Regarding the latter, Stitt believes it to be an interim, perhaps short-term remedy. "If FibreChannel is already installed, I wonder if FCoE would be a good choice," Stitt posits. "Ethernet will ultimately carry storage traffic. Ethernet ultimately wins."

Under a legal threat from another software firm with a similar name, Acresso Software Inc. is changing its name to Flexera Software after just 19 months. Acresso sells software such as its installation utility, InstallShield, and its software license manager, FLEXnet, to software vendors and enterprises. The company will officially announce the change next Tuesday, but had already notified partners and customers on Thursday.

It was spun out of Macrovision Corp. after the unit was acquired by private equity firm Thoma Cressey Bravo in April 2008. Macrovision retained the digital rights management (DRM) apps for which it is best known; it changed its own name to Rovi Corporation in July. The Acresso name, which the company said was derived from the Latin word "cresco," meaning "to grow, increase," faced a "challenge" from ERP software maker Agresso Software, said Randy Littleson, senior vice president of marketing for Acresso. "Our executive team decided that there were better ways to invest our time and money, and that we didn't need this distraction," Littleson said. "The action we're taking will let us avoid a potential lawsuit." Agresso did not immediately return an e-mailed request for comment. Agresso was founded in 1980 and has annual revenue of about $475 million. That dwarfs Acresso, which has 375 employees and annual revenues of $115 million.

Agresso also has 3,500 employees at 16 offices globally. Flexera will be the fourth name in five years facing long-time users of InstallShield, which was bought by Macrovision in 2004. Littleson said the company considered changing its name to InstallShield, since it is its best-known product, but ultimately concluded that it didn't represent the breadth of its application stable. Perhaps predictably, early public reaction to the new name tended towards the sarcastic. "As if the makers of InstallShield hadn't already done enough damage to their brand, let's just go change names yet again!" wrote Christopher Painter, an InstallShield consultant, on his blog yesterday. "Acresso Software is becoming Flexera Software for no apparent reason. Go ahead. #ScrambleMyBrands," another tweet said. Littleson dismissed the notion, brought up by some bloggers, that the new name will cause legal trouble or just confusion with Flexera, a solar and wind power company.

"We're quite aware of it. We think this is very different, compared to when it was two software companies. That's one of the reasons why it's Flexera Software," he said. "How similar are we to an energy company?"

While NASA is crashing into the moon to look for ice, it's also looking for the frozen stuff here on Earth, only in a much more conventional way. The space agency said that on Oct. 15 it will start a series of 17 flights to study changes to Antarctica's sea ice, glaciers and ice sheets. The flights are part of what NASA calls Operation Ice Bridge, a six-year project that is the largest airborne survey ever made of ice at Earth's polar regions. Researchers will work from NASA's DC-8, an airborne laboratory equipped with laser mapping instruments, ice-penetrating radar and gravity instruments.

NASA said data collected from the flights will fill in data gaps between the agency's Ice, Cloud, and Land Elevation Satellite, known as ICESat, which has been in orbit since 2003, and NASA's ICESat-II, scheduled to launch no earlier than 2014. ICESat is nearing the end of its operational lifetime, making the Ice Bridge flights critical for ensuring a continuous record of observations, NASA stated. Data collected from the mission will help scientists better predict how changes to the massive Antarctic ice sheet will contribute to future sea level rise around the world, NASA stated. The payload on the DC-8 includes the Airborne Topographic Mapper, a laser altimeter that can produce elevation maps of the ice surface. The Laser Vegetation Imaging Sensor maps large areas of sea ice and glacier zones. Other instruments flying include the Multichannel Coherent Radar Depth Sounder which measures ice sheet thickness and the varied terrain below the ice.

A gravimeter will give scientists their first opportunity to measure the shape of the ocean cavity beneath floating ice shelves in critical spots of Antarctica. A snow radar will measure the thickness of snow on top of sea ice and glaciers, NASA stated. Because airborne observations lack the continent-wide coverage a satellite provides, mission planners have selected key targets to study that are most prone to change. Sea ice measurements will be collected from the Amundsen Sea, where local warming suggests the ice may be thinning. Ice sheet and glacier studies will be flown over the Antarctic Peninsula and West Antarctica, including Pine Island Glacier, an area scientists believe could undergo rapid changes. According to NASA, the Antarctic continent may be remote, but it plays a significant role in Earth's climate system.

The expanse is home to glaciers and ice sheets that hold frozen about 90 percent of Earth's freshwater - a large potential contribution to sea level rise should all the ice melt. How and where are Antarctica's ice sheets, glaciers and sea ice changing? Compared to the Arctic, where sea ice has long been on the decline, sea ice in Antarctica is growing in some coastal areas, and snow and ice have been accumulating in some land regions in the east. West Antarctica and the Peninsula, however, have seen more dramatic warming and rapid ice loss, NASA stated.

The U.S. Department of Education is rolling out desktop encryption software in a way that links the cryptographic process to employees' government-issued Personal Identity Verification (PIV) smart cards. Tying encryption to the PIV card is a novel approach that offers stronger authentication than a simple password. The system, which is based on PGP's disk encryption technology, is intended to meet government rules for safeguarding sensitive financial and personal information, says Phillip Loranger, chief information security officer at the Department of Education. "There is a large amount of financial resources we're responsible for; we are in the student-loan business and we interface with universities and colleges," Loranger says. The Department of Education is actually "one of the largest banks in the country, with grants, student loans and school financial requests," he says.

The agency picked PGP in part because the encryption software company is willing to do some custom development to make sure that its Whole Disk Encryption software works with the government-issued PIV smart card and Microsoft Active Directory, Loranger says. The Department of Education intends to first deploy PGP's Whole Disk Encryption on all mobile computers to protect data at rest. Loranger says he's in favor of the more stringent security tied to the PIV smart cards, but he acknowledges there will be situations when end users forget their PIV cards or lose them. In such circumstances, employees won't be locked out of their computers but will be granted a temporary password they can use for 24 hours, he says.
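The fallback Loranger describes is essentially a two-path unlock policy: the PIV smart card when it is present, otherwise a time-limited password. The Python sketch below is a conceptual illustration of that policy under stated assumptions; it is not PGP's or the agency's actual pre-boot authentication code, and the helper names are hypothetical.

```python
import secrets
from datetime import datetime, timedelta

TEMP_PASSWORD_TTL = timedelta(hours=24)   # temporary password valid for 24 hours

def unlock(card_present: bool, card_pin_ok: bool,
           temp_password: str | None = None, issued_at: datetime | None = None) -> bool:
    """Hypothetical unlock decision: prefer the PIV card, fall back to a
    time-limited temporary password for users who forgot or lost the card."""
    if card_present:
        return card_pin_ok                                  # strong path: PIV card + PIN
    if temp_password and issued_at:
        return datetime.utcnow() - issued_at < TEMP_PASSWORD_TTL   # fallback path
    return False

def issue_temp_password() -> tuple[str, datetime]:
    """Help-desk path for a lost or forgotten card."""
    return secrets.token_urlsafe(12), datetime.utcnow()
```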

A bank that inadvertently sent confidential account information on 1,325 of its customers to the wrong Gmail address is suing Google for the identity of the Gmail account holder. The case, filed in the U.S. District Court for the Northern District of California, involves Rocky Mountain Bank of Wyoming. According to court documents, the bank in August received a request from one of its customers asking for certain loan statements to be sent to a third party.

An employee of the bank, responding to the request, sent the documents to the wrong Gmail address. In addition to the requested loan information, the bank employee also inadvertently attached a file containing names, addresses, tax identification numbers and other details on 1,325 account holders. When it discovered the error, the bank immediately sent an e-mail to the Gmail address asking the recipient to delete the previous e-mail and the attachment. The bank also asked the recipient to contact the bank to discuss what actions had been taken to comply with the bank's request. When it received no reply, the bank sent an e-mail to Google asking whether the Gmail account was active or dormant and what it could do to prevent unauthorized disclosure of the inadvertently leaked information. When Google refused to provide any information on the account without a formal subpoena or court order, the bank filed a complaint asking the court to force Google to identify the account holder.

Rocky Mountain Bank also requested that its complaint and all of the pleadings and filings in the case be sealed. The bank argued that if the complaint and motion papers were not sealed, all of its customers would learn of the inadvertent disclosure; it said it hoped to prevent unnecessary panic among its customers and a "surge of inquiry from its customers." U.S. District Court Judge Ronald Whyte dismissed that request, saying there was no need for the proceedings to be sealed. "An attempt by a bank to shield information about an unauthorized disclosure of confidential customer information until it can determine whether or not that information has been further disclosed and/or misused does not constitute a compelling reason," Whyte wrote last week. This is the third time in recent weeks that Google has faced a similar issue.

Earlier this month, the Associated Press reported that a resort developer in Miami had obtained a court order requiring Google to disclose the identities of anonymous contributors to an online newspaper in the Turks and Caicos Islands. The man alleged that the contributors to the paper had unfairly linked him to government corruption. In that case, Google indicated that it would disclose the data only after first informing the paper about the request and giving it a chance to appeal for the court order to be quashed. In the other incident, a court in New York compelled Google to disclose the identity of a blogger who had made disparaging comments about a Vogue model in her blog "Skanks in NYC."

The ITU Telecom World exhibition has returned to Geneva after a visit to Hong Kong in 2006 - and has brought many Asian exhibitors back with it. The booths of China Mobile, ZTE and Datang Telecom Group loom over the entrance to the main hall, alongside those of NTT DoCoMo and Fujitsu, while upstairs the Huawei Technologies and Samsung Electronics booths dwarf that of Cisco Systems, which has more meeting rooms than products on display. "Ten months ago, people were urging us to cancel the event," said Hamadoun Touré, secretary-general of the International Telecommunication Union, which organizes the exhibition and the policy forum that runs alongside it. The pessimists feared that the show would attract neither exhibitors nor visitors, as companies slashed marketing budgets and cut back on business travel in the midst of the economic downturn. There are also signs that the way some companies are using the show is shifting.

The ITU still expects 40,000 visitors at this year's show; 82,000 turned up at the last Geneva event, in 2003. While the show is noticeably smaller than previous editions - it occupies only Halls 2, 4 and 5 of the sprawling seven-hall Palexpo exhibition center, with some yawning gaps between stands - Touré is satisfied. "It's a good show, despite the crisis," he said. This year, around half the show is occupied by national pavilions: Saudi Arabia has the biggest, followed by those of Spain and Russia. Other European nations, including Belgium, France and the U.K., also have pavilions, but by far the most numerous are those of the African nations: Burundi, Egypt, Ghana, Kenya, Malawi, Nigeria, Rwanda, Tanzania and Uganda. The biggest company stands are those of the Asian network operators and equipment manufacturers, with the U.S. and Western European countries keeping a low profile. Microsoft and IBM have booths, but you'd barely notice. This domination of the show floor is not down to size alone: It's also about tactics.

Russia deployed what looked like an army of violinists dressed mostly in sequins on its stand on Monday. There were actually only three of them, but their effect was magnified by loud music and the multiple video walls on the booth. China Mobile has taken a similar route, with the logo of its 3G mobile brand, Wo, swirling and pulsing hypnotically across the walls and even the ceiling of its booth. Similar exhibits fill the stands at NTT DoCoMo and Samsung. ZTE has taken a more traditional route, with glass cases full of mobile phones, modems and cellular base stations. On the Cisco booth, there are almost no products to be seen - unless you count the looming bulk of one of its TelePresence systems, linking the booth in high resolution to similar systems around the world.

Other elements of the Cisco product range are present virtually thanks to another screen, supplied by Massachusetts-based Kaon Interactive. This shows images of the products that can be rotated on screen to examine them from different angles - and even measured or dismantled so that prospective buyers can figure out whether they would fit in their data center. Like Secretary-General Touré, Cisco faced a crucial decision last year about whether to maintain a show presence in Geneva. "One year ago, it wasn't clear how many customers were going to make this trip," said Suraj Shetty, the company's vice president of worldwide service provider marketing. However, the company realized that "this could be used as an opportunity to shift how we get contact with customers," he said. That's why the rest of the stand is given over to meeting rooms. "Our focus is on customer intimacy," Shetty said. Carrier Ethernet specialist Ciena has taken a similar approach.

Its stand, close to Cisco's and even more discreet, consists entirely of meeting rooms. Like Cisco, it prefers to show products virtually rather than physically. "Computer graphics and touch screens are more effective in these cases. That's the trend," said Ciena CTO Stephen Alexander. If you're buying bulky network or data center infrastructure, then don't expect to kick the tires at a trade show next year - although you might be able to click on them, on the booth's screen or your own.

A network of Russian malware writers and spammers paid hackers 43 cents for each Mac machine they infected with bogus video software, a sign that Macs have become attack targets, a security researcher said yesterday. In a presentation Thursday at the Virus Bulletin 2009 security conference in Geneva, Switzerland, Sophos researcher Dmitry Samosseiko discussed his investigation of the Russian "Partnerka," a tangled collection of Web affiliates who rake in hundreds of thousands of dollars from spam and malware, most of the former related to phony drug sites, and much of the latter targeting Windows users with fake security software, or "scareware." But Samosseiko also said he had uncovered affiliates, which he dubbed "codec-partnerka," that aim for Macs. "Mac users are not immune to the scareware threat," said Samosseiko in the research paper he released at the conference to accompany his presentation. "In fact, there are 'codec-partnerka' dedicated to the sale and promotion of fake Mac software." One example, which has since gone offline, was Mac-codec.com, said Samosseiko. "Just a few months ago it was offering [43 cents] for each install and offered various promo materials in the form of Mac OS 'video players,'" he said. Another Sophos researcher argued that Samosseiko's evidence shows Mac users, who often dismiss security as a problem only for people running Microsoft's Windows, are increasingly at risk on the Web. "The growing evidence of financially-motivated criminals looking at Apple Macs as well as Windows as a market for their activities is not good news - especially as so many Mac users currently have no anti-malware protection in place at all," said Graham Cluley, a senior technology consultant at U.K.-based Sophos, in a blog entry Thursday. Mac threats may be rare, but they do pop up from time to time.

In June 2008, for example, Mac security vendor Intego warned of an active Trojan horse that exploited a vulnerability in Apple's Mac OS X. Last January, a different Trojan was found piggybacking on pirated copies of Apple's iWork '09 application suite circulating on file-sharing sites. Mac OS X's security has been roundly criticized by vulnerability researchers, but even the most critical have acknowledged that the Mac's low market share - it accounted for just 5% of all operating systems running machines that connected to the Internet last month - is probably enough protection from cyber criminals for the moment. Samosseiko's paper on the Partnerka can be downloaded from Sophos' site (download PDF).

The U.S. Secret Service is investigating a poll posted on Facebook asking people to vote on whether President Barack Obama should be assassinated. Special Agent Ed Donovan, a spokesman for the Secret Service, said this afternoon that the agency has launched a probe into the matter and currently is looking for the person who posted the poll. He said the poll, which went online Saturday, was taken down Monday morning after the Secret Service alerted Facebook to its presence on the site. The poll asked "Should Obama be killed?" and gave users the choice of "yes," "maybe," "if he cuts my health care" and "no."

A screen shot of the poll, which was posted on the blog The Political Carnival, shows that at some point at least 387 people had voted. Neither the Secret Service nor Facebook would say how many people voted in the poll or what the results were. Barry Schnitt, a spokesman for Facebook, was quick to point out that this was not a poll that originated from the social networking site itself. "The third-party application that enabled an individual user to create the offensive poll was brought to our attention this morning," wrote Schnitt in an e-mail to Computerworld. "The application was immediately suspended while the inappropriate content could be removed by the developer and until such time as the developer institutes better procedures to monitor their user-generated content." A source within law enforcement noted that while posting the poll, in and of itself, is not illegal, federal investigators can't discount the possibility that the person behind the poll has malicious intentions. The source said the Secret Service needs to interview the person to gauge his or her ultimate intent.

Dell has agreed to buy Perot Systems for around US$3.9 billion in cash, and intends to make the company its global services delivery division, the companies said Monday. The deal will allow Dell to expand its range of IT services, and potentially allow it to sell more hardware to existing Perot customers, it said. It will also allow Dell, in the future, to address customer demand for next-generation services including cloud computing, said CEO Michael Dell in a conference call with analysts.

Perot Systems is one of the largest services companies serving the health-care sector, from which it derives about 48 percent of its revenue, its CEO Peter Altabef said during the call. Around 25 percent of revenue comes from government customers, he said. Dell is counting on its international reach to turn Perot into a global services company, Dell CFO Brian Gladden said during the call. Perot is already working at increasing its international revenue: on Friday it announced a 10-year deal to outsource IT operations for Indian hospital group Max Healthcare. Over the last four quarters, Dell and Perot together had revenue of $16 billion from enterprise hardware and IT services, with $8 billion coming from enhanced services and support, Dell said. Dell's rival Hewlett-Packard expanded its own global services unit with the acquisition of EDS for $13.9 billion in May 2008. EDS was founded by H. Ross Perot, who sold the company to General Motors before going on to found Perot Systems, of which his son is now chairman.

Perot's contribution to that is relatively small: In 2008, the company reported total revenue of $2.78 billion. The boards of Dell and Perot agreed to the terms of the transaction on Sunday, they said. At $30 per share, Dell's offer represented a significant premium - roughly 68 percent - over Friday's closing price of $17.91 for Perot Systems shares. In after-hours trading early on Monday morning, the stock traded at $29.70. Dell expects that overlaps between the two companies will allow it to cut Perot's costs by between 6 percent and 8 percent, Gladden said during the conference call.

Dell expects to complete the deal in its November-to-January fiscal quarter. Upon completion of the acquisition, Dell plans to make Perot Systems its services unit, and will put Altabef in charge of it. It also expects Ross Perot Jr., chairman of the Perot Systems board, to be invited to join the Dell board of directors. The services unit will fit alongside Dell's existing divisions for selling to large enterprises, government customers and small and medium-size businesses. Dell created the three divisions in a major reorganization of its business sales teams last December, shifting from a geographic structure to one aligned with customer types.

Four months after it modified Windows 7 to stop the Conficker worm from spreading through infected flash drives, Microsoft has ported the changes to older operating systems, including Windows XP and Vista, the company announced on Friday. In April, Microsoft altered AutoRun and AutoPlay, a pair of technologies originally designed for CD-ROM content, to keep malware from silently installing on a victim's PC. The Conficker worm, which exploded onto the PC scene in January, snatching control of millions of machines, used several methods to jump from PC to PC, including USB flash drives. Conficker copied a malicious "autorun.inf" file to any USB storage device connected to an already-infected machine, then spread to another PC if the user connected the device to that second computer and picked the "Open folder to view files" option under "Install or run program" in the AutoPlay dialog.

Microsoft responded by changing Windows 7 so that the AutoPlay dialog no longer let users run programs, except when the device was a nonremovable optical drive, like a CD or DVD drive. After the change, a flash drive connected to a Windows 7 system only let users open a folder to browse a list of files. Four months ago, Microsoft promised to make similar changes in other operating systems - Windows XP, Vista, Server 2003 and Server 2008 - but declined to set a timeline. On Friday, Microsoft used its Security Research & Defense blog to announce the availability of the updates for XP, Vista and the two Server editions. Microsoft actually issued the updates almost three weeks ago, on Aug. 25, but did not push them to users automatically via Windows Update or the corporate patch service Windows Server Update Services (WSUS). Instead, users must steer to Microsoft's download site, then download and install the appropriate update manually. Links to the download are included in a document posted on the company's support site.
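The behavioral change itself is easy to state: the "run program" option that an autorun.inf file used to add to the AutoPlay dialog now survives only for nonremovable optical media, never for USB flash drives. The Python sketch below models that decision, alongside an autorun.inf of the sort Conficker dropped; it is illustrative only, since Windows' real logic lives in the shell, not in a function like this.

```python
# Example of the kind of autorun.inf Conficker planted on flash drives,
# dressing up "run my program" to look like the folder-open option.
SAMPLE_AUTORUN_INF = """\
[autorun]
open=setup.exe
action=Open folder to view files
"""

def autoplay_options(drive_type: str, has_autorun_inf: bool) -> list[str]:
    """Post-update policy: honor the autorun 'run program' entry only for
    nonremovable optical media (CD/DVD)."""
    options = ["Open folder to view files"]
    if drive_type == "optical" and has_autorun_inf:
        options.insert(0, "Install or run program")   # still allowed for CD/DVD
    return options

print(autoplay_options("removable", True))   # USB stick: no run-program option
print(autoplay_options("optical", True))     # CD/DVD: run-program option remains
```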

The Windows XP update weighs in at 3MB, while the one for Vista is about 7MB. The AutoRun and AutoPlay changes debuted in the Windows 7 Release Candidate (RC), which was available for public downloading from May 4 to Aug. 20. Windows 7 is set to go on sale Oct. 22.

Microsoft's federated identity platform passed its first SAML 2.0 interoperability test with favorable marks, signaling the end to the vendor's standoff against the protocol. The eight-week, multivendor interoperability workout conducted by the Liberty Alliance and the Kantara Initiative also resulted in passing marks for two other first-time entrants, SAP and Siemens. Return testers Entrust, IBM, Novell and Ping Identity also passed. Results were announced Wednesday. "The Liberty Interoperable testing was a great opportunity to verify that Active Directory Federation Services (AD FS) 2.0 is interoperable with others' SAML 2.0 implementations. This should give our customers confidence that their federation deployments using ADFS will 'just work,'" says Conrad Bayer, product unit manager for federated identity at Microsoft.

In the past, Microsoft has been dismissive of the Security Assertion Markup Language (SAML), a standard protocol for exchanging authentication and authorization data between and among security checkpoints, preferring WS-Federation and other protocols it helped develop. The company previously supported the SAML token, but never the transport profiles of the protocol. "It is significant that Microsoft participated given their previous stance on the SAML protocol," says Gerry Gebel, an analyst with the Burton Group. "For the first product version that supports SAML, they have covered the core bases." Microsoft's interoperability testing focused on SAML's Service Provider Lite, Identity Provider Lite and eGovernment profiles. The company says it plans to support other SAML profiles based on demand. The interoperability event featured the largest group of participants ever for the testing, which has been run twice previously. In addition, it was the first test to include an international group to test the eGovernment SAML 2.0 profile v1.5; the test featured the United States, New Zealand and Denmark. "The fact that we were able to put so many new implementations through a full matrix, rigorous interoperability test speaks to the maturity of the SAML 2 protocol," says Brett McDowell, executive director of the Kantara Initiative. "And it is not just implementation; there is a tremendous amount of deployments." "Full matrix" testing means all participants must test against each other.
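Full-matrix testing grows quickly with the number of participants, because each implementation must be exercised against every other one in both roles. A small Python sketch of the pairings, using the vendors named above:

```python
from itertools import permutations

participants = ["Microsoft", "SAP", "Siemens", "Entrust", "IBM", "Novell", "Ping Identity"]

# "Full matrix": every implementation is tested against every other one,
# once acting as the identity provider (IdP) and once as the service provider (SP).
test_pairs = [(idp, sp) for idp, sp in permutations(participants, 2)]

print(len(test_pairs), "IdP/SP pairings")   # 7 * 6 = 42 pairings
```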

The test was conducted over the Internet from points around the globe using real-world scenarios between service providers and identity providers as defined by the SAML 2.0 specification. Microsoft participated in the testing with Active Directory Federation Services 2.0 (formerly code-named Geneva), which is slated to ship later this year. Microsoft said earlier this year it would have SAML 2.0 certification before it released Geneva. ADFS 2.0 is part of a larger identity platform that includes Windows Identity Foundation and Windows CardSpace. It provides identity information and serves as a Security Token Service (STS), a transformation engine that is key to Microsoft's identity architecture.

The SAML profiles ADFS 2.0 supports cover the core features of federation. ADFS lets companies extend Active Directory to create single sign-on between local network resources and cloud services. It wasn't all smooth sailing for Microsoft, however, as some participants reported problems using Internet Explorer 6.0 and 7.0 for SAML single sign-on, which is primarily a Web browser action. The issue was noted in a report by the Drummond Group, which conducted the testing, and centered on long URL values, mostly when encryption was enabled during specific operations. Internet Explorer does not accept URLs longer than 2,083 characters.
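One common workaround for that limit is to build the HTTP-Redirect URL and fall back to the HTTP-POST binding whenever the encoded request would push the URL past what IE 6/7 accepts. The Python sketch below illustrates the idea under stated assumptions; it is not ADFS 2.0's actual behavior.

```python
import base64
import zlib
from urllib.parse import urlencode

IE_MAX_URL = 2083   # Internet Explorer 6/7 URL length limit cited above

def build_redirect_url(idp_sso_url: str, authn_request_xml: str) -> str:
    """Encode a SAML AuthnRequest for the HTTP-Redirect binding
    (raw DEFLATE, then base64, then URL-encode)."""
    deflated = zlib.compress(authn_request_xml.encode())[2:-4]   # strip zlib header/checksum
    saml_request = base64.b64encode(deflated).decode()
    return idp_sso_url + "?" + urlencode({"SAMLRequest": saml_request})

def choose_binding(idp_sso_url: str, authn_request_xml: str) -> str:
    """Fall back to HTTP-POST when the redirect URL would exceed IE's limit."""
    url = build_redirect_url(idp_sso_url, authn_request_xml)
    return "HTTP-Redirect" if len(url) <= IE_MAX_URL else "HTTP-POST"
```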

Testers got around the issue by using other browsers; Microsoft tested against IE 8 and Firefox 3.5.2. While Microsoft's participation was an important milestone for the advancement of SAML, McDowell says the current testing is significant on other fronts. The test marks a transition, with the Kantara Initiative now taking over future tests. The group will adopt the Liberty Alliance testing methods and expand the scope of tests to include other protocols in addition to SAML. And it will build off the eGovernment profile testing as new profiles for other vertical markets, including healthcare and telecommunications, are developed. The level of cooperation between governments will serve as a model for other industries, he says. "Having countries come together and agree on a deployment profile, that is not to be understated," McDowell says. In addition, next year Kantara will pick two other protocols to test from a list made up of WS-Security, Information Card, Identity Metasystem Interoperability, OAuth and XRD. Kantara also will take cues from Project Concordia and eventually begin to test cross-protocol interoperability.

The next Kantara interoperability test is slated for next year.


SolarWinds Tuesday announced an updated product that the company says will enable IT departments to use Cisco IP SLA to better manage WAN connections, router performance statistics and VoIP metrics. SolarWinds' Orion IP SLA Manager replaces the vendor's Orion VoIP Monitor and combines capabilities to track voice metrics such as jitter, latency and packet loss with visibility into Cisco's IOS IP SLA. According to Cisco, IOS IP SLAs "use active monitoring to generate traffic in a continuous, reliable and predictable manner, thus enabling the measurement of network performance and health." SolarWinds says it decided to monitor the Cisco technology with a commercial product (the vendor already makes a free IP SLA monitoring tool available) because enterprise IT managers are overcoming the traditional barriers to Cisco tools such as IP SLA and NetFlow. "Traditionally there were key barriers to the deployment of IP SLA in customer environments. It could potentially have a pretty negative impact," says Josh Stephens, head geek for SolarWinds. "That has changed a lot over the past few years and now you can configure devices in such a way that IP SLA and NetFlow don't impact the operation of the device, but still add value when it comes to network performance monitoring." The software, targeted ideally at network engineers, can understand from every point on the network how voice applications, for instance, are performing, Stephens says.

The product can also help network managers get, from one tool, metrics on how each site is operating from a WAN perspective. It tracks edge-to-edge router performance statistics that can be exported into a dashboard for quick reference as well, SolarWinds says. Because IP SLA is already built into Cisco routers, network managers can quickly generate network and services performance data to identify site-specific or WAN-related performance issues. "Performance can vary greatly across sites," Stephens explains. "This product helps to make the process of collecting this data simple and helps network engineers better understand the performance of the networks, applications and services." Competitive products include CA's eHealth, which CA obtained via its Concord Communications buy, and tools developed by InfoVista. SolarWinds Orion IP SLA Manager pricing starts at $1,495, including first-year maintenance. Orion IP SLA Manager requires an installation of Orion Network Performance Monitor (NPM); pricing for Orion NPM starts at $2,475, including first-year maintenance. A free 30-day trial of the product is available for download.
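The data Orion IP SLA Manager reads comes from probes configured on the routers themselves, such as a UDP jitter operation between two sites, which yields the jitter, latency and packet-loss figures mentioned above. The Python sketch below generates a typical probe configuration; the IOS command syntax shown is representative of recent releases but varies by version, so treat it as illustrative rather than definitive.

```python
def udp_jitter_probe(entry: int, target_ip: str, port: int = 16384) -> str:
    """Return IOS configuration lines for a VoIP-style UDP jitter probe.
    The far-end router also needs 'ip sla responder' configured."""
    return "\n".join([
        f"ip sla {entry}",
        f" udp-jitter {target_ip} {port} codec g711ulaw",   # simulate a G.711 voice stream
        " frequency 60",                                     # run the probe once a minute
        f"ip sla schedule {entry} life forever start-time now",
    ])

print(udp_jitter_probe(10, "10.1.2.1"))
```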


Dell on Monday announced a definitive agreement to purchase Perot Systems, the IT services company founded by Ross Perot in 1988. The acquisition, expected to close between November and January, greatly expands Dell's reach into the IT services and support market, particularly in the government and healthcare sectors. Dell is paying $3.9 billion for Perot Systems, a 68% premium over the stock's closing price before the deal was announced. Here's a look at some of the key questions related to the deal. Why does Dell think Perot is worth such a high price?

Buying Perot is part of Dell's plan to expand its footprint in the IT services market, which may be a necessity at a time when hardware sales are falling. The 2007 hiring of Stephen Schuckenbrock, former COO at Electronic Data Systems, was one of several moves signaling Dell's intent to move further into the services industry, but the Perot deal is the strongest step yet in this regard. "Over the last couple of years they have more or less created a platform for a true entrance into the service market," says Forrester analyst Paul Roehrig. "They're really over-exposed on the hardware side. In a lot of ways, this is a natural extension of the trajectory they have been on." Nearly half (48%) of Perot's revenue comes from the healthcare industry, and 25% comes from the government sector. By purchasing Perot, Dell immediately expands its penetration into both markets, which are likely to grow, Roehrig says. The Obama administration's attempt to expand federally funded healthcare coverage is another consideration. "If you were betting a couple billion dollars, what industry would you bet on?" he says. "In North America and globally there's a lot of technology enablement that has to happen in those spaces." Dell officials said they are also seeing demand from customers who want to virtualize their data centers and build private cloud networks, and buying Perot will significantly bolster Dell's ability to serve those customers.

How will Perot be operated within Dell? Dell said it will combine its own services organization with Perot's, and Perot Systems will become the focus of Dell's services business. The operation will be run out of Plano, Tex., Perot's corporate headquarters, and will be led by Peter Altabef, Perot's CEO. Dell has pulled in $5.1 billion in services revenue over the last year, while Perot did $2.6 billion in business, so the combined services organization has annual revenue of nearly $8 billion. What will happen to Perot's employees and leadership team? Dell officials say they expect to cut $300 million in costs out of the two companies over the next two years.

As with any acquisition, there will likely be layoffs to reduce overlap between the two companies. Perot has 23,000 employees. Top executives are staying put, with Dell saying it has reached "long-term retention agreements" with both Altabef and "critical members of his senior leadership team." Is Ross Perot still involved in Perot Systems? The 79-year-old Perot has served as chairman emeritus of the company's board since September 2004. Perot, a former presidential candidate and a major figure in the IT services industry for nearly five decades, is also the founder of Electronic Data Systems, which was purchased by HP. Perot's son, Ross Perot, Jr., a former CEO of Perot Systems, is now chairman of the Perot board of directors and will be considered for appointment to the Dell board after the acquisition closes, according to Dell.

The acquisition "definitely makes a statement," says Gartner analyst Dane Anderson, and gives Dell new expertise in the healthcare and government markets. Should competitors in the IT services industry be worried by the Dell-Perot combination? But the merger is not a guarantee of success. "Whether they suddenly become the next big competitor to IBM, Accenture, or HP EDS, that remains to be seen," he says. HP, having purchased EDS, does $40 billion in services revenue, Anderson says. Size-wise, Dell's services organization still pales in comparison to some competitors. IBM is even bigger with $57 billion in services revenue.

With EDS and Perot now off the block, are there any other IT services firms that might be acquired? Yes, but which company might get acquired next is anyone's guess, Anderson says. Companies like CSC and Accenture could be takeover targets, but the same could be said of numerous services vendors. "The reality is, I'm not sure anyone is out of play," Anderson says. "The issue is who has got the intestinal fortitude and cash to make things happen." HP and IBM obviously have the cash. But HP already acquired EDS, and IBM has such a large services organization that further growth will be difficult to achieve. "I don't think IBM will acquire a big services company, but it's not necessarily ruled out," Anderson says. "IBM already has $57 billion, $58 billion in services revenue, so how do you grow that effectively, especially when there are many fewer billion-dollar-plus deals out there? You really have to find a way to create volume in those smaller deal sizes." Can we expect Dell to make more acquisitions?

Yes. Dell was able to expand its iSCSI storage business last year by purchasing EqualLogic and has indicated a willingness to continue expansion through acquisition. "We're looking for more things like EqualLogic which build on strong IP and allow us to extend the significant customer reach we have," Michael Dell, chairman and CEO of the company, said in a conference call Monday. Should Dell and Perot customers have any concerns about the merger? Customers should always examine the potential ramifications of an acquisition, analysts say. One question is whether Dell will pressure clients to use Dell hardware rather than servers and storage from Sun, HP or others. "With these kinds of integrations, clients get nervous," Roehrig says. But on balance, customers have reason to expect that the Dell-Perot combination will provide new opportunities or at least not be harmful.

But "customers shouldn't panic about this. In fact, they should look at it as an opportunity to see if it remains a good fit and see if they can generate additional value for their own firms based on [the combination of Dell and Perot]." Customers of service companies that have close partnerships with Dell's hardware division may have reason to be nervous, however. "If my service provider is really relying on Dell, that's something I'd worry about," Roehrig says.

Federal Communications Commission Chairman Julius Genachowski told Congress today that the FCC would have a new plan for auctioning off a key piece of public safety spectrum by February 2010. Speaking before the House Energy and Commerce Subcommittee, Genachowski said that plans for the spectrum, commonly referred to as the "D block" on the 700MHz band, are part of the FCC's emerging national broadband plan due to be delivered to Congress next year. The FCC had originally tried to auction off the D block as part of its auction of 700MHz spectrum last year. But while total bids for the 700MHz band nearly doubled congressional estimates of $10.2 billion, no bidder met the reserve price for the D block, which was originally reserved for the construction of a high-speed public safety network that would bring America's emergency response system up to date with next-generation technology. When the auction ended, the top bid for the D block was less than half its $1.3 billion reserve price. Genachowski would not provide any further details on what form a new auction for the D block would take, saying only that the commission was working diligently to get the block on the market. "The challenge is in getting this right and we don't want to rush into a failed auction," he said. "The D block comes up often in connection with our broadband plan but we don't have anything concrete right now."

Frontline Wireless, a start-up carrier that had planned to bid aggressively for the public safety block, announced that it was shutting down its business just weeks before the 700MHz auction began. In the weeks leading up to the auction, analysts at the Yankee Group predicted that the "horrendous" ownership costs of the block, whereby prospective licensees would be responsible for building out a national public safety network with 75% population coverage within four years of getting the license, would deter companies from making significant bids on the spectrum. With Frontline out of the picture, the D block received only one significant license bid, and the fate of the spectrum has been in limbo ever since.

Apple quietly added an anti-fraud feature to the iPhone's Safari browser with the update to iPhone 3.1, released Wednesday. But the new defense, meant to prevent users from reaching phishing sites, is inconsistent at best, a security researcher said today, with some users getting warnings about dangerous links while others are allowed to blithely surf to criminal URLs. Other experts said that the fickle feature is worse than no defense at all. According to Michael Sutton, the vice president of security research at Sunnyvale, Calif.-based Zscaler, the new protection is "clearly having issues." At first, said Sutton, the anti-phishing feature was simply not working. "It was blocking nothing," Sutton claimed after testing iPhone 3.1's new tool Wednesday against a list of known fraudulent sites.

By Thursday, things had improved, but just barely. "Yesterday, it started blocking some sites, for some users, but it was inconsistent," Sutton said. "Some sites are being blocked, others are not." Apple relies on Google's SafeBrowsing API (application programming interface) for the underlying data used to build anti-phishing and anti-malware blocking lists for the desktop edition of its Safari browser; other browser makers, including Google and Mozilla, also use SafeBrowsing. That led Sutton to believe that the feature's functionality wasn't the issue, but how Apple updates users with a "blacklist" of malicious sites. "It appears some iPhones are getting timely updates [from Apple], but others are not, or are getting different [block list] feeds," Sutton said. "I'm feeling better about the feature than I was Wednesday, but clearly Apple is still having issues. URLs that are blocked by Safari in Mac OS X open and direct users to malicious pages [on the iPhone]. With the [media] coverage of the problem, maybe they're resolving it, or trying to." The new feature is turned on by default in iPhone 3.1; the option to turn it off is in Settings/Safari/Security, listed as "Fraud Warning."

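To make the mechanism the article describes more concrete, here is a minimal sketch of how a SafeBrowsing-style client check generally works: the browser keeps a locally cached list of short hash prefixes of known-bad URLs and warns when a visited URL's hash matches one. This is an illustration only, not Apple's or Google's actual implementation; the prefix values, the canonicalization, and the function names are all made up for the example.

```python
import hashlib

# Hypothetical local store of 4-byte SHA-256 prefixes downloaded from a
# SafeBrowsing-style list provider. Real feeds are binary-encoded and updated
# incrementally; these values are invented purely for illustration.
LOCAL_PREFIXES = {bytes.fromhex("a1b2c3d4"), bytes.fromhex("deadbeef")}

def canonicalize(url: str) -> str:
    """Crude canonicalization: drop the scheme and fragment, lowercase the host."""
    url = url.split("#", 1)[0]
    if "://" in url:
        url = url.split("://", 1)[1]
    host, _, path = url.partition("/")
    return host.lower() + "/" + path

def is_suspicious(url: str) -> bool:
    """Flag a URL when the hash prefix of its canonical form is on the local list.
    A real client would then confirm the match with a full-hash lookup against
    the provider before showing the warning page."""
    digest = hashlib.sha256(canonicalize(url).encode()).digest()
    return digest[:4] in LOCAL_PREFIXES

if __name__ == "__main__":
    # Almost certainly False with the made-up prefixes above.
    print(is_suspicious("http://example.com/login"))
```

The point of the design is that stale or missing local prefix data silently weakens protection, which is consistent with Sutton's theory that the iPhone's problem lies in how the block list is delivered rather than in the checking logic itself.
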
On Thursday, researchers at Intego, a Mac-only antivirus vendor, echoed Sutton's findings. "This feature should warn users that they may be visiting a known malicious Web site and ask if they wish to continue," said Peter James, a spokesman for Intego who writes the company's Mac security blog. "However, we have extensively tested this feature, tossing dozens of phishing URLs at it, and it simply does not seem to work." Like Sutton, James reported inconsistencies in the anti-fraud feature's effectiveness. "All we've come up with is that sometimes it works and sometimes it doesn't," said James. "This is clearly more dangerous than no protection at all, because if users think they are protected, they are less careful about which links they click." Sutton, although willing to concede that Apple overall is improving its security track record, bemoaned the state of mobile security in general, and the iPhone's in particular. "The greater concern to me is that we're making the same mistakes in mobile that we made on the desktop," he said. "On the desktop, security has gotten slowly better, but [with mobile] we have a fresh start. I would have thought we would have learned from our mistakes, but there's virtually no protection in mobile browsers." According to research conducted by NSS Labs, which was hired by Microsoft to benchmark different desktop browsers' ability to block malware-laden sites, Safari in Mac OS X and Windows blocked only one in five malicious sites. Internet Explorer and Firefox, meanwhile, blocked 80% and 27%, respectively, while Google's Chrome blocked a paltry 7%. Last month, NSS Labs attributed the disparities between Firefox, Safari and Chrome, all of which use SafeBrowsing as the basis for their blacklists, to differences in how each browser tweaked, then applied, the lists.

Kai-Fu Lee, who left his job as Google's China head last week, launched an angel investment fund on Monday that aims to cultivate three to five new high-tech companies per year in China.

The fund, named Innovation Works, launched with US$115 million in investment from major IT players including Foxconn and Legend Group, the parent company of Lenovo, it said in a statement. The fund will use the money to train young entrepreneurs and help them build companies that could later be sold or listed on a stock market. It will focus on the Internet, mobile Internet and cloud computing industries in China.

The fund aims to develop 50 companies and train 500 people in 10 years, and may eventually seek a public listing itself, Lee told reporters at a briefing.

"My hope is there will be a company in these 50 that becomes a world-class company," he said.

Innovation Works will speed the growth of its companies by giving them shared resources, such as software and server room space, and potentially by helping them partner with the fund's own investors, such as Legend, Lee said.

The fund is launching in a market where entrepreneurial skills are currently scarce, Lee said. Compared to most angel investment outfits, he said, Innovation Works will both keep a larger cut from the sale or listing of the companies it incubates and give them more support. The fund may be able to spin off companies within its first year, and it will pay back its investors once it begins doing so regularly, after three to five years.

China's Internet industry is booming as incomes rise in the country and more people get online. Official statistics say China had nearly 340 million Internet users at the end of June, more than the population of the U.S.

Other investors in Innovation Works include venture capital firm WI Harper Group and Steve Chen, a co-founder of YouTube.

Lee left Google last week after four years as the company's president of Greater China, during which he led the company's growth into a formidable player in China's online search market.

Let's say you're an enterprise that manages a large fleet of vehicles using location-based services to track cars in real time.

And let's say your telecom carrier tells you they will cut the price of your monthly LBS bill in exchange for letting them show you a brief advertisement every time you accessed the service. Would you accept their offer?

The willingness of customers to tolerate advertisements as part of their telecom services has been an issue that telecom carriers have been grappling with lately as they contemplate how to maintain their revenues and avoid being relegated to providing "dumb pipes" that only connect customers to the Web without offering any value-added services.

Carriers are looking at advertisements as a source of revenue, says ABI Research analyst Dominique Bonte, because location-based services such as vehicle fleet management, navigation and family finders haven't been as successful as carriers had hoped. In North America this year, for instance, ABI projects that LBS will account for roughly $1.5 billion in revenue, which would represent a small portion of the revenues generated by the telecom industry as a whole. Last year, for example, AT&T and Verizon generated combined revenues of more than $141 billion.

"The reason why LBS hasn't taken off as many people would have expected is the fact that it's fairly expensive to maintain,"  Bonte says. "Most of the location-based applications are available for a monthly fee. Some of the enterprise apps are around $25 extra per month per device. So far, adoption has stayed within a small segment of early adopters."

So in order to lower the costs of LBS to users, carriers are considering implementing advertisements that will soak up some of the cost of maintaining the services and thus drive down the price. Kittur Nagesh, a director of service provider marketing at Cisco, says that one of the goals of Cisco's service provider division is to help telcos upgrade their networks to better handle LBS and to offer specialized advertisements. For instance, the company has provided European carriers T-Mobile Hungary and Network Norway with its Content Services Gateway to help improve their personalized content offerings and billing capabilities. As Nagesh sees it, carriers have to upgrade their networks to offer more LBS and ads in order to keep their revenues strong.

"If the intelligence in the network is not there, then the operator will be relegated merely to a transport service," he says. "They're not seeing any revenue from Web services such as Flickr or YouTube, but the ecosystem is now ready to give them a piece of the action. I believe ISPs have a goldmine in their hands and they haven't unleashed it yet."

When asked how ISPs should best utilize advertisements to generate revenue without annoying their users, Nagesh says that it would require a system of trial and error to determine the most effective methods.

"I believe this will be a journey the industry has to take," he says. "Obviously they will have to show ads that are relevant to users in their locations. So if you're looking at highlights from a game over the network, you'll have to look at an ad first. Or maybe while you're waiting for a download to finish, the carrier gives you a 30-second ad. If the ads are relevant then the threshold of tolerance will be higher."

Gartner analyst Tole Hart expresses more skepticism about how much advertising for LBS will add to telcos' revenues, however.

"Advertising on mobile phones is a good way to generate revenues but it won't be close to what carriers make on access charges," says Hart, who estimates that global revenues from mobile advertising will total between $11 and $12 billion this year. "I certainly think it's worthwhile because eventually advertising will become a bigger piece as smartphones become commonplace. But it's not going to be huge."

Bonte is also somewhat skeptical about how much advertising will contribute to carriers' business models, and says that carriers may simply have to live with lower revenues than they would prefer if they want to remain relevant.

"Despite the appeal of location-based advertising, it remains to be seen how much revenue can be generated from it," he says. "Certainly there is interest from customers who want to use LBS, but they simply don't want to pay too much for it… In terms of subscribers, we project that there will be around 245 million LBS users by 2014, but the question will become whether they pay for it or will it be predominantly free and supported by advertising."

Bonte thinks that carriers' best options going forward are either to offer unlimited LBS as part of an "unlimited everything" plan at a fixed monthly rate, similar to what Sprint currently offers for mobile voice, SMS and data services, or to open up their networks so that more third-party developers can create value-added applications and services more affordably. While Bonte acknowledges that carriers have long resisted the latter arrangement, he says they may have to do it in the end to maintain their market positions.

"It comes down to either losing out completely on what's happening or getting a piece of the pie," he says. "And it's better to be dumb pipe that has some piece of the pie than to be left out completely."

Proxim has released two new wireless broadband products, one for affordable, long-range backhaul, the second for high-capacity last-mile access.

Both the Tsunami QuickBridge-8100 point-to-point backhaul radio and the Tsunami MP-8100 point-to-multipoint product are proprietary radios that use the same underlying 4G technologies found in WiMAX and LTE: MIMO and OFDM.

As a result, both radios provide high capacity, long range and non-line-of-sight connectivity for their prices, a combination that is a first for any Proxim platform. The radios operate in two bands, 2.3-2.5GHz and 4.9-6GHz, covering both licensed and unlicensed spectrum. These are relatively low frequencies, which enable longer range and better building penetration.

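A rough sketch of why lower frequencies carry farther, using the standard free-space path-loss formula; the specific frequencies and the 10 km distance are chosen only for illustration, and real links also depend on antennas, terrain, transmit power and non-line-of-sight effects.

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB (textbook formula, distance in km, frequency in MHz)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Compare a 10 km link in Proxim's lower band, its upper band, and a higher
# microwave band typical of traditional backhaul gear.
for freq in (2400, 5800, 23000):  # MHz
    print(f"{freq} MHz over 10 km: {fspl_db(10, freq):.1f} dB")

# Each time the frequency rises by a factor x, free-space loss grows by
# 20*log10(x) dB over the same distance, so the 2.3-2.5GHz band keeps several
# dB more margin than higher bands, which translates into longer reach (and,
# in practice, somewhat better penetration through walls).
```
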
There are already 1Gbps radios on the market, including the 1.6Gbps Horizon Duo from DragonWave. But the DragonWave product, designed for very high-capacity backhaul, has a range of 5 to 7 miles, runs in licensed or unlicensed bands between 11GHz and 38GHz, and carries a price tag ranging from $12,000 to $30,000.

One of the key selling points for the new Proxim gear will be its prices: The QB-8100 starts at $6,600 and the MP-8100 at $1,550.

The QB-8100 offers data rates of 300Mbps and 600Mbps, with throughput of 100Mbps and 200Mbps, and these backhaul links can reach over 40 miles. Latency is 1-2 milliseconds, a key requirement for handling streaming traffic such as voice and video, according to Proxim President and CEO Pankaj Manglik.

The MP-8100 offers the same data rates and throughput. It can share that capacity as fixed broadband wireless access for multiple subscribers in offices and homes. It can also be used for cost-effective video surveillance deployments, according to Manglik.

Both radios have two Gigabit Ethernet ports, both of which support Power over Ethernet. One acts as the uplink; the second serves local devices, such as a video camera, which can draw the power it needs from the radio unit itself.

Both use Proxim's own Wireless Outdoor Routing Protocol, along with an array of common network management and security standards. Proxim plans eventually to introduce a 1Gbps product based on the same radio technology.

IBM today said one of its researchers has made it possible for computer systems to perform calculations on encrypted data without decrypting it. While that sounds somewhat counterintuitive and complicated, IBM says the breakthrough would let computer services, such as Google or others that store customers' confidential electronic data, fully analyze that data on their clients' behalf without expensive interaction with the client and without actually seeing any of the private data.

The idea is that a user could search for information using encrypted search words and get encrypted results they could then decrypt on their own. Other potential applications include enabling filters to identify spam, even in encrypted email, or protecting information contained in electronic medical records. The breakthrough might also one day enable computer users to retrieve information from a search engine with more confidentiality, IBM said.

IBM researcher Craig Gentry came up with what he calls "fully homomorphic encryption," which uses a mathematical construct known as an "ideal lattice" and lets people fully interact with encrypted data in ways previously thought impossible.

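For readers unfamiliar with the idea, the sketch below shows the simplest form of "computing on ciphertexts" using textbook (unpadded) RSA, which is only multiplicatively homomorphic: multiplying two ciphertexts yields an encryption of the product of the plaintexts. It is not Gentry's lattice-based construction, which additionally supports addition and therefore arbitrary computation, and the tiny, insecure parameters are for illustration only.

```python
# Toy demonstration of a *partially* homomorphic scheme (multiplication only).
p, q = 61, 53                  # toy primes; real keys use primes of 1024+ bits
n = p * q                      # public modulus
phi = (p - 1) * (q - 1)
e = 17                         # public exponent
d = pow(e, -1, phi)            # private exponent (modular inverse, Python 3.8+)

def encrypt(m: int) -> int:
    return pow(m, e, n)

def decrypt(c: int) -> int:
    return pow(c, d, n)

a, b = 7, 6
# Multiply the two ciphertexts without ever decrypting them...
blind_product = (encrypt(a) * encrypt(b)) % n
# ...and the result decrypts to the product of the original plaintexts.
assert decrypt(blind_product) == (a * b) % n
print(decrypt(blind_product))  # -> 42, computed entirely on encrypted values
```

A fully homomorphic scheme like Gentry's allows both addition and multiplication on ciphertexts, which is enough to evaluate any function, such as a search or a sales analysis, on data the service provider never sees in the clear.
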
Using the technology could also bolster the cloud computing model where a service provider hosts the confidential data of others. It might better enable a service to perform computations on clients' data at their request, such as analyzing sales patterns, without exposing the original data.

"Fully homomorphic encryption is a bit like enabling a layperson to perform flawless neurosurgery while blindfolded, and without later remembering the episode. We believe this breakthrough will enable businesses to make more informed decisions, based on more studied analysis, without compromising privacy. We also think that the lattice approach holds potential for helping to solve additional cryptography challenges in the future, " said Charles Lickel, vice president of Software Research at IBM in a release.

According to an article on Forbes.com, Gentry's solution has a catch: It requires immense computational effort. In the case of a Google search, for instance, performing the process with encrypted keywords would multiply the necessary computing time by around 1 trillion, Gentry estimates.

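To put that multiplier in rough perspective, using an assumed baseline purely for illustration: if a plaintext query took on the order of a tenth of a second of server compute, a trillion-fold increase would mean roughly 10^11 seconds, on the order of a few thousand years, of equivalent processing for a single encrypted query.
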
IBM said two fathers of modern encryption - Ron Rivest and Leonard Adleman - together with Michael Dertouzos, introduced and struggled with the notion of fully homomorphic encryption approximately 30 years ago. Although advances through the years offered partial solutions to this problem, a full solution that achieves all the desired properties of homomorphic encryption did not exist until now.

Forty-two percent of CIOs suffered budget decreases in the first quarter of 2009, and IT shops on average slashed budgets by 4.7%, according to new research published by Gartner.

CIOs had been expecting an average first-quarter budget increase of 0.16% late last year, but were forced to cut costs as the economy worsened. Fifty-four percent of CIOs reported no change in their IT budget, while a scant 4% enjoyed an increase. Companies that did reduce IT spending cut their budgets by an average of 7.2%; counting all companies, including those with flat budgets and increases, the average decline was 4.7%.

"CIOs reported that renegotiating vendor contracts and head count reductions were the primary focus areas for accommodating budget reductions," Gartner analyst Mark McDonald says in a press release. "CIOs report shifting more work to in-house resources and delaying capital expenditures more than reducing IT project investments."

The findings are based on a survey of 900 CIOs from across the globe, encompassing $77 billion in IT spending. The survey, conducted in March and April, was compared to results from a similar survey of 1,500 CIOs conducted from September to December.

Budget cuts spanned enterprises of all sizes, geographies and industries. Healthcare organizations reported an average budget increase of 2.2%, but CIOs in every other major industry reported a decline in the first quarter, Gartner said. Cuts of 10% were seen in the professional services, telecommunications and high-tech sectors, an 8% cut was reported in manufacturing, and 4% cuts were reported at utilities and financial services organizations.

Many CIOs say further cuts in 2009 are unlikely, and that they expect the economy to recover between the first and third quarters of 2010. But they are bracing for the possibility of further budget reductions.

"The percentage of CIOs with a contingency plan for the remainder of 2009 has more than doubled compared with 2008," Gartner reports. "CIOs with additional contingency plans for 2009 are planning for the potential of renewed IT spending, as well as additional reductions. While 44% of CIOs do not believe they will need to tap into their contingency plans, those that do believe they will [expect to] do so during the next six months."

CIOs are already planning to ramp up IT spending once the economy recovers. Increases in "IT investment projects and workforce levels" will be the top priorities for CIOs during the expected turnaround, according to the survey. "Software, hardware and infrastructure investments are also high on the CIO's agenda on the path to economic recovery," Gartner says.