Saturday, October 27, 2007

Windows XP SP3 to include some Vista features

It's not just a patch and hot fix update, says site that leaked details


A Web site that leaked details of Windows XP Service Pack 3 over the weekend claimed that the update includes several new features, including some borrowed from Windows Vista.

According to NeoSmart Technologies, Windows XP SP3 build 3205, which was released to beta testers on Sunday, includes four new features among the 1,000-plus individual hot fixes and patches that have been issued since XP SP2's debut three years ago.

Features backported from Vista, said NeoSmart, include Network Access Protection (NAP), an enterprise policy enforcement technology that inspects client PCs before they access a corporate network, then updates the machines if necessary or blocks them if they don't meet specified security criteria.

Other additions range from a kernel module containing several encryption algorithms that can be accessed by third-party developers, to a new Windows activation model that doesn't require users to enter a product key.

Microsoft had previously announced SP3 support for NAP, which is part of Windows Vista and will be included in the not-yet-finalized Windows Server 2008.

Windows XP SP3, which Microsoft has said will be released early in 2008, will be one more move by the developer to extend the lifespan of the six-year-old operating system. Last month, for example, Microsoft gave Windows XP a five-month reprieve by pushing back the end of retail sales and sales of XP-powered PCs by large resellers to June 30, 2008.

And last week, Microsoft debuted a new "get-legal" program that lets companies purchase large quantities of Windows XP Professional licenses through their usual resellers.

Microsoft was not immediately available for comment on the leak, or the new features touted by NeoSmart.

Read more...

Microsoft considers opening up Device Manager

Door left open to support for non-Windows Mobile devices


SAN FRANCISCO -- Microsoft Corp.'s new Mobile Device Manager faces a shortcoming because it is exclusive to Windows Mobile devices, but that might change, an executive said today.

Scott Horn, general manager of Microsoft's mobile and embedded device group, left the door open to potential future support for devices that aren't based on Windows Mobile.

"Today, we have nothing to announce," he said. "But we're looking at it, we're thinking about it. Who knows what the future brings." Horn spoke during a press lunch at the CTIA Wireless IT and Entertainment conference here.

He mentioned that Microsoft has in the past licensed ActiveSync as a way to extend services to non-Microsoft devices.

Microsoft introduced System Center Mobile Device Manager at the conference. The software lets IT administrators manage and secure Windows Mobile phones. Unlike some other management products on the market, including Nokia Corp.'s Intellisync, it is only compatible with phones running the Windows Mobile operating system.

AT&T Inc., which is supporting Mobile Device Manager, has encountered inertia among IT administrators when it comes to supporting mobile devices. IT managers are worried about security and management issues, said Mike Woodward, vice president for business marketing at AT&T. He said the System Center Mobile Device Manager software and the services of Enterprise Mobile, a new company supported by Microsoft, will spur more enterprise use of mobile phones. Woodward could not offer specifics about how AT&T will support Mobile Device Manager or how it will work with Enterprise Mobile.

For now, the new software and the services from Enterprise Mobile, which helps organizations deploy and manage mobile phones, are designed for large operations. Mobile Device Manager, which is expected to come out next year, could support as many as 5,000 users. Enterprise Mobile doesn't expect to begin thinking about serving smaller businesses for another 12 to 18 months, because it will be focused on making sure it knows what large organizations need, said Steve Moore, president of Enterprise Mobile.

Read more...

Cisco to buy WiMax start-up for $330M

Navini Networks acquisition is first foray into WiMax


Cisco Systems Inc. has agreed to buy Navini Networks Inc., a developer of WiMax broadband wireless access systems, for $330 million.

The deal marks Cisco's first foray into WiMax technology. Earlier this month, Cisco wouldn't comment on reports that it planned to buy Navini, saying it had no plans to develop wireless base stations using any technology other than Wi-Fi.

Navini makes mobile WiMax wireless base stations.

Wi-Fi and WiMax are wireless networking technologies defined in standards set by the Institute of Electrical and Electronics Engineers Inc. WiMax (802.16) has a range over a hundred times greater than the older and more widely deployed Wi-Fi (802.11) family of standards.

Cisco said it is particularly interested in Navini's expertise with "smart beamforming" technologies used with multiple-input, multiple-output (MIMO) antenna arrays, which in Wi-Fi systems allow base stations to handle much higher data throughput.

Cisco plans to fold Navini into its wireless networking business unit. It expects the acquisition, its 124th, to close by the end of January.

Read more...

Use of IP VPNs, carrier Ethernet to rise, survey says

While many companies are still using legacy technologies, such as ATM and frame relay, most plan on modernizing at least some of their locations within the next two years, according to a Current Analysis Inc. survey.

The survey, which was conducted among roughly 120 decision-makers directly involved in selecting corporate WAN services, found 44% of companies used frame relay, while 25% used ATM. Among frame-relay users, 75% said that they planned on switching to IP VPNs within the next two years, while 46% said they planned on switching to carrier Ethernet within the next two years.

Meanwhile, 80% of ATM users said they planned to switch to IP VPNs within the next two years, while 61% said they planned to switch to carrier Ethernet within the next two years.

"Incumbent service providers are challenged as more businesses demand IP and Ethernet services, yet many of their largest enterprise customers continue to operate on aging ATM and frame-relay networks," said Current Analysis analyst David Hold. "However, our survey indicates that the majority of those legacy users are ready to make the move to next-gen services."

The survey found the major reasons companies are deciding to switch over to next-generation WANs are the need for higher bandwidth, lower costs, and a desire for bundled voice, video and data services.

Overall, a majority of firms surveyed had already implemented next-generation WAN technology, as 68% reported using IP VPNs and 67% reported using carrier Ethernet.

The survey's sample tilted toward bigger companies, as roughly two-thirds of respondents were categorized as medium to large businesses, with 36% spending over $10 million per year on telecom services, and 31% spending between $1 million and $10 million per year.

Read more...

Cracking Google's 'secret sauce' algorithm

A clue: 'pretend we're not here'; a reward: tens of millions of dollars

Rand Fishkin knows how valuable it is for a Web site to rank high in a Google search. But even this president of a search engine optimization firm was blown away by a proposal he received at a search engine optimization conference in London last month, where he was a panelist.

The topic: Can a poker Web site rank high on a Google search using purely white hat tactics -- meaning no spamming, cloaking, link farms or other frowned-upon "black hat" practices? Fishkin answered yes, provided the site also added other marketing techniques and attracted some media attention.

The rest of the panel scoffed. "Don't bring a knife to a gunfight," one chided. After all, this is the cutthroat online gambling sector.

But one poker Web site owner was intrigued, and he later approached Fishkin. "He said, 'If you can get us a search ranking in the top five for online poker or gambling [using white hat methods], we'll buy that site from you for $10 million,'" recalls Fishkin, president and CEO of SEOmoz in Seattle. Intrigued but skeptical, Fishkin consulted other gambling site owners at the conference. "They said, 'If it really does rank there, we might be interested in paying you $10 million more.'"

Turns out, a single online gambling customer brings in at least $1,000 in revenue. With a recent Google search of "Texas Holdem Poker" yielding 1.64 million results, it's easy to see why site owners would pay millions to crack the code for Google's PageRank algorithm -- the elusive Holy Grail of online marketing.
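The economics check out on the back of an envelope. Using the two figures in the story -- the $10 million offer and at least $1,000 of revenue per customer -- a quick break-even sketch (the monthly-customer figure is a made-up illustration, not from the article):

```python
offer_usd = 10_000_000          # the $10M buyout offer quoted above
revenue_per_customer = 1_000    # per the article: at least $1,000 each

# Customers needed for the buyer to recoup the purchase price:
break_even_customers = offer_usd // revenue_per_customer
print(break_even_customers)     # 10000

# With a hypothetical 5,000 new customers per month from a top-5
# ranking (an assumption for illustration only):
monthly_customers = 5_000
months_to_break_even = break_even_customers / monthly_customers
print(months_to_break_even)     # 2.0
```

Even under conservative assumptions, a site that converts a few thousand searchers a month repays an eight-figure price within months, which is why the bidding for a white-hat top ranking escalated so quickly.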

The stakes are high for online businesses -- and Google is the formidable gatekeeper between site owners and their customers. Web sites, such as kinderstart.com, have even sued Google for what they allege are deliberate de-rankings, though none have been successful to date. Site owners are eager to get their hands on the 75% of free Google traffic that is not affected by AdSense and AdWords, Google's pay-per-click programs. With 47% market share among search engines and 3 billion search inquiries a month, Google is indeed king.

Read more...

Seven things to know about reducing risk with an e-mail archive

Not archiving your e-mail properly can land you in legal trouble and cost millions


The following is excerpted from a transcription of the Sept. 11 Wikibon.org Peer Incite Meeting, focused on issues surrounding an article titled "Architecting e-mail storage," by Wikibon community member and consultant Kashik Das. The meeting was a discussion of specific issues by four recognized subject-matter experts and Wikibon.org community members: Josh Krischer, David Floyer, Peter Burris and David Vellante.


Krischer: There is no point in compliance if you don't keep e-mail. In Germany, for example, e-mail is official business paper, and companies have to put in the footer all the company details. All business e-mail has to be kept for 10 years.

Floyer: A lot of business is done with e-mail. About 10% of an average IT user's working life is spent on e-mail; for some people, it's a lot more than that. And there are huge deposits of e-mail and of instant messaging that leave a very strong audit trail of how organizations and people have been acting. This is a good thing, and it's also a risk.

What we have found in our discussions with people is [that] the primary driver with e-mail archiving is reducing risk. It's usually top-down, either from the CEO or board or from the legal department -- legal counsel [deciding] that e-mail archiving with mitigation systems needs to be deployed.

The courts have been emphasizing that e-mail should be captured. One of the primary objectives of e-mail archiving is just literally being able to prove in a court of law that [the e-mails] have all been captured and none have been changed. The secondary requirement for an e-mail archive is that it allows the exploitation of that data to help reduce risk to the organization.

Krischer: I identify three kinds of risk: compliance requirements for preserving e-mail, the risk of punitive damages if you can't produce the e-mails in a court case and [personal] protection. For instance, in [the Enron case], some defendants [as part of their defense] showed they were ordered to do illegal things ... in e-mails from company officers. Enron investigators found a lot of relevant information in deleted and recovered e-mails. See the $1.45 billion judgment against Morgan Stanley in the Ronald Perelman case because Morgan Stanley could not reliably produce e-mails for the court.

Floyer: If you think of a spectrum of risk, at one end, you have organizations at high risk with a lot to lose, usually highly regulated, so for example banking environments or trading environments in particular. The fundamental risk is that they can be closed down if the regulators find they are not complying with the regulatory requirements -- and there are a lot of requirements in that area. Obviously, to them, reduction of risk is very important.

Vellante: There are certain industries that are regulated. For instance, SEC Rule 17A came out and essentially mandated that all electronic communications be archived in the financial services industry -- you have to keep everything.

Floyer: At the other end, you might take for example retail operations, which have razor-thin profits and have enormous pressures on just staying in business. What's interesting was [that] the fundamental strategy was one of minimizing the risk by minimizing the number of e-mails kept. So they kept e-mail for less than a month and then got rid of them altogether. They kept [the e-mails] of only about 200 key executives of all the people in a large organization.

Is the second approach legal? The new federal rules say you have to have solid procedures in place and that those procedures have to be kept -- reasonable procedures. Whether reasonable is getting rid of stuff after 30 days, well, time will tell. But their argument is "the less kept the better."

Burris: This raises a very interesting question. Let's talk for one second about what we mean by risk. It sounds as though in Germany there are statutory edicts that dictate what you are supposed to do from an archiving standpoint. Whereas in the U.S., there have been some edicts, but for the most part, the biggest concerns stem from what we have learned from case law over the past few years -- namely the discovery process and how that is going to work. The risk issue then becomes different in the two places. In Germany, you either are in compliance or you are not, whereas in the U.S. ... you never know, because case law is going to evolve over the next few years, and some very high-priced law firms are going to find some loopholes and screw some companies in the process.

So, does that ... change the nature of risk? [It] certainly suggests that in the U.S., because of the uncertainty of how this will play out over the next few years, that this will absolutely be decided by corporate legal minds as opposed to anybody else.

Krischer: Some of the companies I surveyed a few months ago said they plan to keep all their e-mails forever. When we ask, "Why do you do that?" the answer is normally, "Because we don't know what we may need in a few years."

1. Focus on the issue of risk when selecting the technology for the base archive.

Floyer: From an infrastructure point of view, what I've seen is sometimes people are very focused on that risk, but other times, the project gets muddied up with a large number of wish lists that get added into the project around e-mail and around disks and around lots of things.

Then, what are the risk mitigation systems that are going to be put into place? Some of those will be technology-driven: The ability to do e-discovery more quickly or completely, for example, may reduce risk. The ability to search for rogue e-mails, the ability to ensure compliance, etc. But an awful lot of what the people we talked to were talking about -- the general training, awareness, stuff like that -- are part of that project but not the responsibility of IT.

So my point is that if you focus e-mail archiving on those two things, you may well come up with a much simpler and easier type of solution than many that are on the market. This focus will tell you which things to maximize and put significant value on in these solutions and which things to discount in the context of risk.

I think some of the current "magic quadrants" that are out there put far too much emphasis on e-mail functionality and fancy systems and fancy technology, and far too little on the core reason for doing it, which is risk reduction.

Architecting the e-mail archive to be flexible, to have access to that data, is incredibly important. And I think that alone can eliminate a number of vendors from any short list. And much simpler solutions then come into play that previously would not have been considered because they don't have all the fancy bells and whistles on them.

2. Good procedures are more important than access speed.

Floyer: What was interesting, for example, was that from a legal risk point of view, having good procedures was much, much more important than speed of access. As long as you could produce [the e-mail required in a legal discovery] within 48 hours, that was fine. Speed of access was not the important criterion for reducing risk. But good procedures that could be shown in court to have been followed were much more important.

For many companies, the reason for outsourcing e-mail archiving was that the outsourcing company showed world-class procedures that they felt would be much better than their own and would hold up much better in a court of law and therefore would be reducing risk, even though the functionality of the actual solution was not as high as others on the marketplace.

So taking that risk reduction I think can significantly simplify that whole process and therefore the whole focus of IT.

3. Do not archive e-mails from before the archive was created.

Floyer: That brings me to one other point. Vendors are often pushing to include historical e-mails. One of the key points of reducing risk is to ensure that you've captured all e-mails and that nothing's been changed. That is a big reduction in risk -- just being able to prove in court that it is a complete record. For historical e-mails, it is going to be very difficult to do that.

Is putting historical e-mails into an e-mail archive going to reduce risk? The answer to that is probably not. It is extremely difficult to do [and] very, very labor-intensive -- extremely disliked by the users themselves. Probably it is better to draw a line in the sand and say from this point onward, all the data is being captured in the e-mail archive. Use the current procedures to go back and look for e-discovery on a best can-do basis and don't try to solve the historical problem by putting it into an e-mail archive. It doesn't reduce risk, and it's extremely expensive.


4. Design for secure transfer from one medium to another.

Krischer: If you want to keep something for 10 years, you can't put it on the same media for 10 years. I mean, theoretically you can do that, but it will cost you a lot of money. For example, because of the price erosion of disk subsystems, it is cheaper to buy a new subsystem after three to four years than it is to pay the maintenance fee for the next six years on the old one. In addition, due to constant technology developments, new subsystems will usually be more reliable, deliver better performance and require less energy. Therefore, in 10 years, at least one media change has to be done, and this migration should be designed and audited [to prove] that nothing was deleted and nothing was modified during this migration.
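Krischer's replace-versus-maintain argument is easy to put in numbers. A minimal break-even sketch with invented prices (all cost figures below are assumptions for illustration, not from the discussion):

```python
# Hypothetical costs for years 5-10 of a 10-year retention window
# (every figure here is an assumption, not from the transcript):
old_annual_maintenance = 30_000   # maintenance fee on the aging subsystem
new_subsystem_price = 100_000     # replacement bought after year 4
new_annual_maintenance = 10_000   # lower maintenance on newer hardware

years_remaining = 6               # years 5 through 10

keep_old = old_annual_maintenance * years_remaining
replace = new_subsystem_price + new_annual_maintenance * years_remaining

print(f"Keep old subsystem: ${keep_old:,}")   # $180,000
print(f"Replace at year 4:  ${replace:,}")    # $160,000
```

With these assumed figures, replacing wins on cost alone, before counting the reliability, performance and energy gains Krischer mentions -- and the migration itself must still be audited so the chain of custody survives the hardware change.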


5. Build to support derivative uses of the data.

Burris: What [does] it mean to build an information store that could be used by derivative applications and create derivative types of value? So, for example, [these could include data] mining activities on e-mail archives to identify pockets of expertise or pockets of activities or pockets of relationships that might have significant business value in an upcoming sales activity or a critical support issue. The storage administrators need to be sensitive when they set up that archive so that there will be derivative uses of that information. It's guaranteed that the business will find ways to use [the archive].

Floyer: This e-mail archive infrastructure ... will live for 10 years, probably more. It's very likely to have a long life because the processes and procedures around it are going to be honed, going to be assessed by auditors, etc., and you won't want to change those very quickly. What that means is the data held in that archive should be accessible and should not be in a format that, to put it crudely, is a vendor lock-in. For example, it should be in some sort of database or file-based system where you can utilize [the data] for other functions.

Link

Real Reveals Six New Bugs in RealPlayer

For the second time in eight days, new critical vulnerabilities that could be used to hijack machines have been fingered in the RealPlayer media player. The patched editions released last Friday for Windows, however, are not vulnerable to the half-dozen bugs, RealNetworks Inc. said.

Hard on the heels of the revelation that RealPlayer sported a major flaw -- and that the bug had been exploited by hackers who compromised an ad server owned by 24/7 Real Media to spread malware to visitors of legitimate, trusted Web sites -- Seattle-based RealNetworks on Thursday posted information about the latest vulnerabilities.

All six bugs involve RealPlayer's problems parsing file formats and could be exploited by hackers who first crafted malicious files, then duped users into either opening those rigged files when they received them as e-mail attachments or visiting an attack site that hosted such files. Among the file types: .mov, .mp3, .rm, SMIL, .swf, .ram and .pl.

"Attackers can exploit these issues to execute arbitrary code in the context of RealPlayer," Symantec Corp. said in an alert Friday. "Successful attacks can compromise the application and the underlying computer."

RealNetworks said that the most up-to-date Windows editions of RealPlayer 10.5 and beta Version 11 are immune to the attacks. Those versions were released last Friday when RealNetworks fixed a flaw in an ActiveX control it installed on systems running Internet Explorer. At least one of the newest flaws can also be traced to the ActiveX control.

Unlike last week's problem, however, four of the six vulnerabilities disclosed Thursday also can be exploited on Mac and Linux machines that have RealPlayer installed. Updated editions are also available for those operating systems, with links available from the security bulletin RealNetworks posted on its site.

Copenhagen-based vulnerability tracker Secunia ApS rated the six just-revealed RealPlayer bugs collectively as "highly critical," the second-highest mark it gives. Symantec rated the bugs separately, with at least one pegged as 8.5 out of a possible 10. But RealNetworks downplayed the risk. "We have received no reports of any machines actually compromised as a result of the now-remedied vulnerabilities," the company claimed.

It can't say the same for last week's vulnerability, which was used by unknown attackers to plant a Trojan horse on PCs whose owners had visited supposedly safe Web sites. The hackers had previously hijacked an ad server operated by Internet advertising company 24/7 Real Media Inc., then infected valid ads that 24/7 served to legitimate sites. When users viewed a page with an infected ad, their Internet Explorer browser was silently redirected to a malicious page from which the Trojan was downloaded and installed.

Although Symantec posted a detailed analysis of the RealPlayer vulnerability and the use of the compromised ad server, 24/7 Real Media has not responded to repeated e-mails this week seeking comment.

Link

Vista vs. Leopard: Battle of the New Features

Leopard introduces lots of new apps and interface features to Mac OS X. Can Vista match up?

Let the Battle Begin!

Every time Apple rolls out a new big-cat-themed release of OS X, it manages to pack in a few interface features and useful apps that eventually make their way across the OS world. Now that Leopard is here, let's take a look through its key features and see how the built-in features in Windows Vista measure up.

