Sunday, December 17, 2006
Microsoft has already launched Vista for their business customers and a retail release is due for January 30.
Gartner said that the company is likely to focus on flexible updates rather than work on monolithic deployments of software releases.
The group further said that by 2010, 60 percent of worldwide mobile phone users will be “trackable” via an emerging “follow-me Internet” technology.
Monday, December 11, 2006
The software giant has now said that it believes Google and Microsoft have pockets deep enough to survive in this competitive market.
Steve Berkowitz, head of Microsoft’s online services group said that Yahoo! is likely to end up becoming a smaller player relying upon third party services.
Berkowitz said: “You have to be able to invest at a level that only right now two companies in the world can invest at and that’s Google and Microsoft.”
Berkowitz added that they need to work hard on getting web users to spend more time on their websites. We say: make your sites work in Firefox and Opera, and they just might do that. My experience of MSN services on the Opera web browser has been terrible. The web is platform independent; do not force the user onto Internet Explorer, ActiveX and Windows Media Player technology.
Tuesday, December 05, 2006
Samsung Electronics joined Microsoft on Monday in launching the first mobile phone in Asia and Europe to use high-speed HSDPA wireless technology.
The companies said the phone — the Samsung Ultra Messaging i600 — was the world's thinnest 3G smartphone with a full QWERTY keyboard. It is also the first smartphone that supports Web applications like podcasts and RSS Feeder, which scans websites for updates, the companies said.
The announcement was made in Hong Kong at ITU Telecom World 2006, a major convention for the telecommunications industry.
The companies said the phone, which can connect with Wi-Fi and Bluetooth 2.0, was designed for work and play. It has two digital cameras and can be used for 3G video calls.
"The mobile population is increasingly looking to use one device that easily plugs into their life, both in and out of the office," said Pieter Knook, a senior vice president at Microsoft.
The device, powered by Microsoft Windows Mobile 5.0, uses the new mobile protocol called HSDPA, or high-speed downlink packet access, that provides faster downloads of video and streaming music. The download speed is designed to be as fast as those provided by ADSL, or Asymmetric Digital Subscriber Line, used in homes.
Tuesday, November 28, 2006
Group Policy Settings Reference Windows Vista
Sunday, November 26, 2006
OGA is meant to be a validation system that checks if a user has a legitimate copy of the software.
Windows Vista's SPP feature requires users to activate the software with a valid activation key within 30 days of purchasing the OS. If that does not happen, the OS goes into reduced functionality mode, which lets users browse the Web for an hour before the system logs them out. To browse more, users must log in again, but they will only have another hour before the process repeats itself.
Microsoft Office 2007 does have a product-activation feature that acts similarly to SPP, but it is not based on validating the legitimacy of the software and it is not new to the application, Microsoft said. Office has had a product-activation feature since Microsoft Office 2000 SR1. Product activation requires the system to be activated with a product key after the application has been started 25 times. If it is not, the application will go into reduced functionality mode.
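As a rough mental model only (the class and method names below are invented and the key check is a stand-in; nothing here is Microsoft's actual code, and only the 25-start threshold comes from the paragraph above), the grace-period logic amounts to a launch counter plus an activation flag:

```python
# Toy sketch of a launch-count activation check, NOT Microsoft's implementation.
# Only the 25-start grace period is taken from the article; all names are invented.

GRACE_LAUNCHES = 25

class ProductActivation:
    def __init__(self):
        self.launch_count = 0
        self.activated = False

    def activate(self, product_key):
        # Stand-in check; a real product would validate the key cryptographically.
        self.activated = bool(product_key)
        return self.activated

    def on_launch(self):
        """Return the mode the application should start in for this launch."""
        self.launch_count += 1
        if self.activated or self.launch_count <= GRACE_LAUNCHES:
            return "full"
        return "reduced-functionality"
```

Once the counter passes the threshold without a successful activation, every subsequent launch lands in reduced functionality mode; a valid activation restores full mode from then on.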
Microsoft is going to make validation checks for Office 2007 mandatory for users of Office Update through its OGA program. Starting in January, users of Office Update will have to validate that their Office software is legitimate before they can use the service.
OGA is a sister program to Windows Genuine Advantage (WGA), launched in July 2005 as a program that automatically checks a user's version of Windows to ensure it is not counterfeit or pirated. WGA evolved into SPP, which became an inherent part of Vista.
Microsoft's antipiracy checking systems have been unpopular from the start, meeting with some resistance from users. WGA was especially unpopular at first when early bugs in its checks were tagging legitimate software as counterfeit or pirated.
Microsoft also was forced to turn off a notification feature in the WGA that sent information to Microsoft from users' PCs when some complained that the feature was acting like spyware.
Friday, November 24, 2006
A little more than a year ago I was one of the lucky few outside of Microsoft to see the inner workings of an Xbox 360. This weekend I had the chance to once again delve into the inner workings of a truly next-generation console.
Like the Xbox 360, the PlayStation 3 console comes in two flavors identified mainly by the hard drive capacity: the low end model contains a 20GB hard drive and the high end model contains a 60GB hard drive. The 60GB version also contains an 802.11b/g adaptor for wireless internet connectivity, a flash memory reader and chrome finishes. Today we will mostly focus on the 60GB version. Other contractors supply PlayStation 3 components, but virtually the entire console is assembled by Asustek - the same guys who make several of the Apple MacBooks.
Read the full article: The Sony PlayStation 3 Dissected.
Thursday, November 23, 2006
Efforts to stamp out piracy have been with us ever since it became possible to copy a program and run it successfully on another computer. Anti-piracy software, code wheels, license keys, hardware dongles and more all failed in some way: a master key code leaked, a crack turned trial software into the full version or removed the check for a dongle, or someone simply picked the lock of the protection.
But now that almost all computers and an increasing array of electronic devices are almost permanently connected to the Internet, or can be wirelessly Net connected in just a few seconds, anti-piracy features that are delivered and updated over the Internet are starting to change this forever.
Copies of Vista and Office 2007 installed from a friend’s CD or DVD will need a valid license key within 30 days or will enter a ‘reduced functionality mode’, severely limiting the ability to use the software. This is actually nothing new, with XP and Office 2003 having had similar features for years.
But with the Windows Genuine Advantage (WGA) and the nearly 18 month old Office Genuine Advantage (OGA) program in full swing, even if pirates are able to ‘crack’ copies of Vista and Office 2007 to work without activation, if you want to get Vista and Office updates, you’ll be subjected to a Genuine Advantage check. If you don’t pass, you don’t get updates.
With Vista, it might get more serious than that. The Software Protection Platform (SPP) may kick in and give you nothing but a browser window and Internet access (if it’s already automatically on), logging you out after an hour. You can keep logging back in every hour, but with access to only one browser window, you’ll need to make your software legitimate, either by buying a license key online there and then with a credit card, or by loading a licensed copy of Vista from DVD.
That’s what already happens after 30 days with Vista if you haven’t activated your copy, but if Microsoft could detect that your copy was pirated with some kind of crack, they could easily get this to activate immediately or with very little warning.
If that happens to pirates, they won’t be very happy, and what will ensue is a tit-for-tat war between pirates and Microsoft, with the pirates breaking the protections and then Microsoft identifying the pirate copy and the cycle starting again, virtually ad infinitum like a guerrilla war.
It’s Microsoft against not only the software pirate ‘rebel’ insurgents, but all those other companies offering free, cheaper or just different alternatives: Mac OS X against XP and Vista, Corel’s WordPerfect Suite against Office, or Google’s Docs and Spreadsheets against Office.
And as Microsoft is planning to license the SPP system to other companies, some of whom are building their own version of the same thing, it’s going to get a lot harder for the pirates to use their pirated software in peace. There’s also always the chance that the WGA or OGA systems malfunction or are hacked and your legitimate copies are flagged as pirate copies, something that already happened early on in the WGA program.
But if this happened on any scale again, Microsoft would theoretically scramble to fix it as quickly as possible, especially if it was widespread, as news would leak and people would report problems to journalists if they didn’t get any positive action to fix the problem.
With most of the world’s software empires having some form of pirate user base – Windows and Office being two big examples of software that has always been highly pirated – people used to running Microsoft software free of charge to save a few bucks will either have to put up with older versions, actually pay for a licensed copy, try a free or cheaper alternative, or play the piracy game with Microsoft alongside the coders who try to get around the protections.
The goal of a PC on every desk, while not globally fulfilled – billions have yet to use a PC – has nevertheless been so successful that it has given Microsoft billions of dollars in revenues and profits, while giving users all kinds of new capabilities, even if they were intertwined with the occasional Blue Screen of Death.
One school of thought says that Microsoft would sell hundreds of millions more copies if only the software was cheaper – imagine if Windows Ultimate was only US $99. Why wouldn’t you buy it? At US $399, and $759 in Australia, it’s not hard to imagine why pirates might want to avoid paying. But in a free market, a company is free to charge what it chooses, with competition providing the incentive to keep prices low.
But even with free operating systems and alternate office suites, browsers and plenty of other free or inexpensive software out there, Microsoft is still the people’s choice with a 90%+ installed user base.
Yes, part of the user base was made up of pirate copies, but that helped, over the years, to turn Microsoft software into the most widely used standard. Of course, hundreds of millions of legitimate sales did that too, with Microsoft reaping rich rewards and becoming the world’s No.1 software company.
Now that most users are connected to the Internet, Microsoft can enforce the licensed use of their software much more easily than ever before. If you’ve pirated in the past and want to keep on using Microsoft’s latest software, well... this time around, you might just find yourself paying, whether by buying retail copies, or getting Vista and Office pre-installed (at much cheaper rates than boxed retail copies when subtracted from the cost of the hardware) on a brand new desktop or laptop PC.
Tuesday, November 21, 2006
Chizen was careful not to phrase explicit threats against Microsoft, but mentioned that Adobe is considering its options. Illegal behavior by Microsoft could be answered with a direct lawsuit or with collaboration with regulators, which Chizen said Adobe is pursuing at this time. He mentioned that Adobe will leave the decision about further action to the EU Commission "for now." He did not deny the possibility of a future suit against Microsoft in the interview.
Microsoft and Adobe have been arguing over PDF export for some time. Back in June, Microsoft said that it would pull support for saving documents in PDF and XPS (XML Paper Specification) formats from Office 2007. A plug-in to export files to PDF and XPS, however, appeared on Microsoft's website in early September and was updated on November 8. According to Microsoft, the software can export the formats from all Office 2007 applications, including Access 2007, Excel 2007, InfoPath 2007, OneNote 2007, PowerPoint 2007, Publisher 2007, Visio 2007 and Word 2007.
Office 2007 is scheduled to become available for business customers on November 30.
Saturday, November 18, 2006
The new products - Windows Vista, Microsoft Office 2007 and Exchange Server 2007 - were released in trial packs as part of the company's business productivity platform and promise to "deliver better results faster". Similar launches are to take place throughout the Caribbean and Central American region as well as other areas around the world, and on November 30 the products will be made available to Microsoft's business customers worldwide.
Retail consumers will not be able to access the software until January 2007.
In a demonstration seminar at the Kingston launch, the new software - much-improved versions of Windows XP, the Microsoft Office 2003 suite and Exchange - was described as "intelligent", "results-oriented" and "sophisticated". All three packages offer new features to help organisations simplify the way people work with each other, to find information quickly and easily, to protect and manage content, and to ultimately reduce IT costs, Microsoft said.
Windows Vista features an easy-to-use interface which makes finding information a breeze. An unprecedented feature of the new software is the desktop search tool, which finds a programme or document in moments rather than forcing the user to hunt through an entire directory. Open programmes are presented in a trendy 3-D format, and minimised programmes running in the task bar can be previewed with a touch of the mouse, eliminating the need to open each one in a search situation.
With Office 2007, the emphasis is on visual appeal and on increasing efficiency. Tool bars have been replaced by 'ribbons', and icons on these 'ribbons' give previews of documents, again rendering tasks super easy.
Exchange Server 2007 includes such security and cost-saving enhancements as a built-in protection to improve the reliability of e-mail, tools to help reduce the cost of running messaging environments and a data protection component which prevents hackers from getting into systems hosted by the Exchange Server.
Even Microsoft Outlook benefited from a make-over. The new version features the user's mailbox folders, a selected message, calendar and 'to do' list, among others, all in one single view.
Microsoft's representatives are pleased with the new products and, in a release issued by the office in the Caribbean and Central America, are quoted as saying: "We are meeting our innovation goals with the introduction of our flagship products - Windows Vista, Microsoft Office 2007 and Exchange Server 2007. ... These three products are easy to use as well as easier to distribute, adapt and manage. (They) connect and integrate with the software, technologies, devices and services that organisations and their partners use or might deploy in the future".
Speaking with the Observer after the launch, Microsoft's territory manager for the western Caribbean and Jamaica, Joe McKinson, said there were great benefits to be had from the new software.
"What it will do is to reinforce and enhance our (Microsoft's) position in the IT industry," he said.
"The new products will make the organisations more efficient from a productivity standpoint. There will be ease of communication, there will be less things to do to get a task done and organisations can now position themselves to be more productive, more efficient and better able to position themselves in the world of work where there is a focus embracing technology," McKinson said, in addressing the benefits to businesses.
He said the corporation's decision to host a launch in Jamaica was testimony to the fact that the technology available here is on par with that of first-world countries, including the United States.
Minister of state in the Ministry of Industry, Technology, Energy and Commerce, Kern Spencer, in addressing the launch, reiterated Jamaica's technological readiness when compared to other countries.
"Today Jamaica is ranked 54th in terms of network readiness in the World Economic Forum's Global Information Technology Report which covered some 115 economies worldwide. In that regard, we are number one in the Caribbean region," Spencer told the group of business and IT bosses.
"Our goal in the ministry is to always be on the cutting edge of technology in keeping with our vision for Jamaica to be the centre of ICT activities in the region. Although the pace is not as fast as we would like, there is no doubt that we have come a far way in a relatively short space of time," said the junior minister.
Any company that manages dynamic content and a lot of web pages can also benefit from Sitemaps. For example, if a company that uses a content management system (CMS) to deliver custom web content (pricing, availability and promotional offers, for instance) to thousands of URLs places a Sitemap file on its web servers, search engine crawlers will be able to discover which pages are present and which have recently changed, and crawl them accordingly. By using Sitemaps, new links can reach search engine users more rapidly by informing search engine "spiders" and helping them to crawl more pages and discover new content faster. This can also drive online traffic and make search engine marketing more effective by delivering better results to users.
The three companies have launched a site that explains the Sitemap protocol and how to generate the Sitemap XML files.
The first version of Sitemaps, Sitemaps 0.84, was introduced by Google in June 2005. Yahoo and Microsoft will use an updated version, Sitemaps 0.90.
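For the curious, a Sitemap is just a small XML file listing URLs and their last-modified dates. The sketch below is a minimal illustration, not an official tool; the example URLs and dates are made up, and only the 0.9 XML namespace is taken from the published protocol:

```python
# Minimal Sitemaps 0.90 generator using only the Python standard library.
# The URLs and dates are hypothetical examples.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(entries):
    """Build a sitemap document from (loc, lastmod) pairs; returns XML text."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc, lastmod in entries:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod  # W3C date, e.g. 2006-11-16
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap([
    ("http://www.example.com/", "2006-11-16"),
    ("http://www.example.com/catalog?item=1", "2006-11-15"),
])
```

A crawler fetching this one file sees every URL and its freshness in a single request, which is the whole point of the protocol: no guessing which of thousands of CMS-generated pages have changed.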
In August this year, Google launched Webmaster Central for better communication with site owners, and Google Sitemaps was renamed Google Webmaster Tools.
"Now, website owners will be able to go to one place for alerting the search engines to their web pages, something they have been requesting for some time", said Tim Mayer, director of product management at Yahoo Search.
"Windows Live Search is happy to be working with Google and Yahoo on Sitemaps to not only help webmasters, but also help consumers by delivering more relevant search results so they can find what they're looking for faster," said Ken Moss, general manager of Windows Live Search at Microsoft. "I am sure this will be the first of many industry initiatives you will see us working and collaborating on."
Wednesday, November 15, 2006
The Sunnyvale-based company rolled out the Personal Internet Communicator - a machine designed by AMD, using AMD processors, but built by an outside contractor - in 2004 as part of a campaign by Chief Executive Officer Hector Ruiz to get more of the world's population online.
The device, which cost $249 for the computer and a 15-inch monitor, initially was sold in India, Russia, China, Mexico and Brazil. Despite the low price, AMD said it intended to make a profit on the item.
But the company said in a filing last week with the Securities and Exchange Commission that it stopped making the machine after it failed to generate significant sales and many of the units were returned.
AMD blamed nearly $16 million in operating losses for the first nine months of 2006 on write-offs related to PIC products, according to the filing.
In a statement late Monday, AMD noted continuing partnerships with the One Laptop Per Child nonprofit group, which is researching ways to build $100 laptops for the world's poorest children, and Microsoft Corp., which is working on pay-as-you-go computing.
"(W)e are expanding what we started with the PIC, developing new business models and new technologies that will be introduced in emerging markets," the company said.
Tuesday, November 14, 2006
It's unknown whether the XML patch will fix a flaw for a zero-day exploit that was reported by security firm Secunia, Inc. in a bulletin issued Nov. 2. That vulnerability specifically targets the XMLHTTP 4.0 ActiveX Control. According to a security advisory, Microsoft is aware of hackers already carrying out exploits; the company doesn't say whether a fix will be part of the Patch Tuesday fixes or an out-of-cycle patch.
Microsoft also updated an advisory, originally issued on Oct. 31, regarding a WMI Object Broker control flaw that affects developers building projects with Visual Studio 2005. There's no indication on the Advanced Notification whether a fix for this will be included in the forthcoming patches.
Five other flaws affect Windows in general; no specific details were provided for them. The bulletin also specifies that the security roll-up will include updates to Microsoft Update, Windows Update and Software Update Services, Windows Server Update Services and its Windows Malicious Software Removal Tool.
Monday, November 13, 2006
In February, Mike Sievert, corporate vice-president for Windows Client marketing told attendees of a Merrill Lynch conference that the software giant expected 200 million new personal computers (PCs) to ship with the new operating system (OS) in the first two years.
Sievert, who was in Malaysia recently, maintains his expectations for Vista.
“Our market is so much larger now than five years ago when we introduced Windows XP (XP). We expect Windows Vista to be the fastest-adopted OS in our history,” he told StarBiz.
He acknowledged that the adoption rate of XP was slower than what the company would have liked it to be. This, he said, was because XP was introduced just one year after its predecessor Windows 2000.
“Windows 2000 was positioned as a major improvement for users of Windows NT and Windows 98 and XP came right on the heels of it. Many people perceived XP as a consumer oriented release when really, it had great benefits,” he said.
Sievert said the uptake for XP picked up after the release of Windows XP Service Pack 2.
“When we got to the fundamentals of what users were struggling with – things like security and reliability of the system – deployment of XP started to accelerate.
“This time we've had a longer gap and plenty of improvements to XP. It has been more than a decade since the Windows OS platform has undergone a revision of this magnitude,” he said.
Despite analyst comments that businesses were nervous about integrating Vista’s security features with legacy systems, Sievert assured that security was at the core of the new OS.
“Businesses are struggling with many issues, particularly safety and security of the computing environment from malware (malicious software) and protection of sensitive and confidential data,” he said.
He stressed that with Vista, security, stability and reliability had been the priority from the very beginning of the development cycle.
“For the past one and a half years we have had thousands of customers involved with us, some as beta testers and some on the technology adoption program.
“They have been able to put Vista's security features under hard trials in a variety of real world environments, different mix of business applications, network structures and hardware and this gives us a lot of confidence as we are weeks away from introducing Vista to the business community,” he said.
Another area of concern for businesses, according to Sievert, was the cost and complexity of managing networks of PCs.
He said Vista was developed to help organisations reduce the cost of deployment, management and support while offering users simplified ways of working, finding, using and sharing information.
“Deployment has traditionally been one of the biggest cost factors associated with owning and operating a network of PCs.
“We have introduced a new application compatibility toolkit much earlier in the development process to allow businesses to discover whether existing applications running on their networks were fully compatible with Vista,” he said.
In addition, Vista had been “componentised”, allowing administrators to set up components of Vista and deploy those settings to desktops throughout the network, Sievert said, adding that these features would dramatically reduce the complexity of the deployment cycle in a business environment.
Sievert said Vista contained more than 500 new group policy objects, which empowered businesses with greater control over their network set up.
“In the past, it was very hard to set group policies that provide users the flexibility they needed.
“Users have different needs – some need to install software, others attach devices. We have been able to overcome this paradox and offer users the flexibility they need with the new group policy manager in Vista,” he said.
He added that support in Vista had also been made more efficient. “One of the things we did was to enhance diagnostics throughout Vista.
“If something happens to a PC, administrators can collate the diagnostic data of the unit and find out what was happening at the time the PC started to experience problems.
“This real time information can be used to root out problems within the network and for other users before they run into it,” he said.
Last Wednesday, Microsoft declared Vista complete and released it to manufacturing, allowing PC and device manufacturers to finalise work on their products and applications.
Vista would be released to volume licence customers by end November, followed by worldwide availability on Jan 30, 2007.
Microsoft Malaysia will be hosting a preview of Vista, Microsoft Office 2007 and Microsoft Exchange 2007 for business and technical professionals tomorrow.
Saturday, November 11, 2006
Despite Microsoft's operating systems being the main target for hackers engineering security exploits, Microsoft's Platforms and Services Division head Jim Allchin said in a conference call, listened in on by bit-tech, that he has no problem allowing his son to run Vista without additional anti-virus security.
Um, the rest of the world might not agree with you there, Mr. Allchin, but he points towards the extensive Parental Controls and "Address Space Layout Randomization," which loads certain Windows components at randomised memory addresses so that exploits relying on fixed addresses are much harder to pull off against Vista users.
Still, Allchin admits the theory won't hold up until Vista's public release. "But I need to say the following: Windows Vista is something that will have issues in security, because the bar is being raised over time. But in my opinion, it is the most secure system that's available, and it's certainly the most secure system that we've shipped. So I feel very confident that customers are far better off by using Windows Vista than they are with anything that we've released before," he said.
Please don't run Vista without an anti-virus, kids. You aren't public relations people.
Wednesday, November 08, 2006
Microsoft is taking a step toward the world of virtualized software products by making Windows Server 2003 Release 2 Enterprise Edition available in virtualized format for what it's calling its Test Drive Program.
Microsoft announced its VHD (virtual hard drive) Test Drive Program on Tuesday at VMworld 2006 in Los Angeles, the third annual user group meeting of virtualization vendor VMware.
SQL Server 2005 Enterprise Edition Service Pack 1 is also available for download off Microsoft's site as a virtualized file. And early in 2007, Windows Vista will be made available for similar, 30-day trials. The goal is to let potential customers evaluate and test the products more quickly, without requiring an on-premises server set-up, says Mike Neil, Microsoft's senior director of virtualization.
The software will be made available in Microsoft's virtual hard disk format, meaning the operating system and database have been captured in a single file along with the virtual operating system of Microsoft's Virtual Server. Customers may take the file and "load it quickly in a virtual machine instead of needing hours or days to configure a physical piece of hardware and install the software," says Neil.
Microsoft is not using the latest general release of its Virtual Server software. Instead it's packaging Virtual Server Release 2, Service Pack 1, a beta version, for the downloads. Because of that, the virtualized files will be able to take advantage of the virtualization hooks, or shortcuts allowing direct access to CPU hardware instead of going through the operating system, that have been built into the latest Intel and AMD chips, Neil says. The result is higher performance for its Virtual Server virtualization engine.
Microsoft is making the same virtualized files available to independent software vendors building Windows applications. They too will be able to distribute their products in the VHD virtualized file format, ready to run in a virtual machine without further configuration. Partners expected to start distributing products in VHD format by the end of the year include BEA Systems, Check Point, Network Appliance, and Platespin.
Although just a preliminary step, the move brings Microsoft and its partners into closer accord with the practice of producing "software appliances", or single-file combinations of operating system, database, and application that are configured to run together without much system administrator intervention. The "appliances" are growing in popularity among Linux users because of the time and cost savings associated with them. VMware technology partners, for example, produce over 300 appliances using Linux.
By putting together its own virtualized, downloadable files, Microsoft has moved much closer to the software appliance approach. It's training its partners in the practice as well. If the 30-day time limit were lifted, they too would be offering appliances for quick download and adoption.
Microsoft now gives away its Virtual Server virtualization engine. Over the last 11 months, it has experienced about 500,000 downloads, Neil says.
Tuesday, November 07, 2006
Read Microsoft Office 2007 Review
Monday, November 06, 2006
In a recent interview with Search Open Source, Webbink downplayed the new relationship between Microsoft and Novell, claiming that the two companies have "gone off the road a bit" and arguing that Red Hat's approach to Linux support and stronger ideological ties to open source will ensure eventual triumph. He points out that the agreement between Novell and Microsoft involves intellectual property licensing, which he says represents a contradiction for Novell and a deviation from the conventional values of the open source community. Webbink thinks that "Novell has fallen into the trap of allowing Microsoft to do exactly what it wants to do, which is to trumpet IP (intellectual property) solutions and promises." According to Webbink, a company "can be either for freedom and collaboration," or "a different approach," but Microsoft and Novell "are trying to do both." The interview asks some good questions, and it is definitely worth a read for those interested in Novell's agreement with Microsoft. Let's examine some of Webbink's arguments and see how they hold up to scrutiny.
Some of Webbink's arguments sound hyperbolic, but he makes some worthwhile points. Although many will dismiss his argument about freedom as mere rhetoric, it is worth noting that, in many cases, enterprise Linux adoption is heavily motivated by a desire for flexibility and freedom from vendor lock-in. Webbink is implying that Novell risks alienating customers if the company's intellectual property agreements with Microsoft lead to limited choice for end-users and decreased involvement of the open source community in Novell's projects. The argument is valid, but is it sound? There is no evidence yet that the intellectual property agreements will have any tangible detrimental effect. The arrangement has certainly created some controversy and uncertainty about Novell's intentions, but it is unclear at this point what sort of impact it will have on Novell's products.
Some are concerned that Novell has entered into this agreement in order to validate inclusion of Microsoft's intellectual property in Mono, the open source .NET implementation. Webbink points out that community concerns have led the Free Software Foundation's legal advisor to question "whether or not [Novell and Microsoft's] partnership was in violation of the GNU Public License." Mono developer Miguel de Icaza has responded to community concerns by pointing out that the open source .NET implementation does not infringe on any of Microsoft's patents, that the product can still be safely included in other Linux distributions besides SUSE, and that Mono developers will continue to ensure that Mono never includes or infringes on Microsoft's intellectual property.
Citing Microsoft's attempts to fund SCO's legal assault on the open source operating system, many Linux enthusiasts are convinced that the proprietary software company's agenda is predatory and that its agreement with Novell reflects a divide-and-conquer strategy. While this may be true, I think it's more likely that Microsoft is responding to customer demand for Linux virtualization in a Windows environment. For Microsoft, Novell is the obvious choice for this sort of alliance because the company actively promotes broader adoption of .NET technology by financially supporting development of Mono. I think it would be naive to believe that Microsoft is interested in anything other than competition, but that doesn't necessarily mean that the company is still determined to destroy Linux. We have seen numerous changes at Microsoft in the past few years, and it is obvious that the company is at least starting to move towards open standards and interoperability.
How will Microsoft's agreement with Novell impact other Linux vendors? I think that it could give SUSE an edge in the virtualization arena, particularly in enterprise environments where users need to run virtual Linux instances on a Windows host. The partnership could also potentially make SUSE look like a safer choice for some companies that are concerned about intellectual property issues. Red Hat is combating that particular advantage by offering stronger indemnification. Ultimately, I think that Red Hat can adequately compete with Novell if the company can convince customers that its stronger commitment to open source ideals and distance from the patent minefield will provide users with more choice and greater flexibility. Will Red Hat be the only Linux distributor left in one year? Don't hold your breath—an alien invasion is probably more likely.
Friday, November 03, 2006
Microsoft has changed the retail license terms for Vista so that customers may now uninstall the OS from one machine and install it on another as many times as they want, the company said Thursday. The new terms remove the limit on the number of new devices to which the license can be transferred.
However, to continue to discourage piracy, Microsoft has worded the license so that it is clear that users cannot "share this license between devices."
When the new licensing was disclosed several weeks ago, power users who rebuild their computers with new components several times a year or who plan to upgrade their computers more than once in the lifetime of the OS raised a fuss. They demanded clarification from the vendor about how scenarios like these would play out under the new licensing.
According to Shanen Boettcher, a Windows general manager at Microsoft, the company thinks it's come up with an answer to placate those users without encouraging software piracy, which the original change was designed to thwart.
"We think this clarification strikes the right balance," he said. Boettcher said the piracy problem has nothing to do with "the enthusiast community that was sending me e-mails," but with people who install one licensed copy of Windows on many machines and sell those to other users.
"This is a definite improvement over the original licensing terms and I'm glad Microsoft has relented," said Don Smutny, a Windows user and software developer for a large IT company in the Midwestern U.S. However, he is still not convinced there aren't other hidden complications within Vista's license that will have to be addressed later.
The change in policy will not affect consumers who purchase their Windows license preinstalled on a PC from a hardware manufacturer. No license transfers are allowed in those cases.
Wednesday, November 01, 2006
Web 2.0 services and software emphasize the collective intelligence of the masses over the intelligence of a single expert authority and derive value through multi-user interaction rather than one-way "package and distribute" publishing. They rely on an empowered user base to facilitate rich and meaningful interaction that can be used to establish deep and viral relationships, and they provide a powerful feedback channel that allows for constant user-driven improvement.
How is Web 2.0 changing media?
Powered by the principles of Web 2.0, online communities and social networks are causing a shift in media consumption patterns away from editorially generated content to content generated by communities. This shift is happening in both the consumer space and the professional trade publishing business. These new Web 2.0 models are connecting peers, allowing them to exchange knowledge and produce information in the act of doing their jobs.
What are some examples of Web 2.0 services?
- Professional and social networking: These services give users the tools to connect with each other and communicate. In the case of professional networking, the communication is productive and focused on achieving a specific goal in the workplace or sharing professional knowledge.
- Community Blogs: Short for "weblogs," blogs are online journals that readers can follow and comment on. Blogs have been characterized as a disruption to traditional media: Blogs are community-driven journals, written by members of an empowered community rather than employees of a publishing firm. In the professional arena, community blogs provide insights and advice from actual experts on the front line that help other community members do their jobs more efficiently.
- Discussion groups: These online communities allow users to communicate in threaded conversations that are started when any member of the community asks a question and then progress towards a resolution as other members of the community respond. Professional discussion groups allow users to have structured conversations that are fact-finding in nature and targeted towards specific goals such as decision making and problem solving.
- Wikis: Wikis are Web sites composed of interconnected pages that can be easily created and edited by any member of a community. They allow for collaborative creation, storage, and organization of information. A common use of wikis within professional communities is to serve as reference guides that reflect the collective knowledge of the community.
Web 2.0 is giving marketers greater opportunity with consumers. Marketers can now "listen" to their customers, become part of the conversation, and communicate in ways that offer real value and workable solutions to achieve optimum ROI on their campaigns.
How can marketers adapt?
Strategies and tools are emerging to help marketers become part of the community and capitalize on the shift in media consumption being driven by Web 2.0 concepts.
Online communities offer the ability to integrate advertising into the experience in a way that users respond to and interact with. An example of an innovative campaign within a community is Oracle's recent sponsorship of a blogcast at ITtoolbox. This sponsorship opportunity featured a respected blog author from the ITtoolbox Blogs program who served as a subject matter expert on a topic of interest to Oracle. In order to listen to the blogcast, interested users were required to provide detailed demographic information, which was qualified and provided to the sponsor. In addition to the tangible benefits of lead generation, Oracle was also able to integrate their brand into the valuable content being generated and consumed by the community.
In addition, the ability to hyper-target the highly specific content created by online communities helps marketers reach their target audience more efficiently. Because a community generates a high volume of content, targeting opportunities like contextual matching and demographic targeting help marketers reach an audience based on the detailed information available.
Advertisers are given the opportunity to enter the community conversation with a relevant offer based on the topic being discussed or the profile of the audience.
Monday, October 23, 2006
It might happen on your CCNA exam, it might happen on your production network - but sooner or later, you're going to have to perform password recovery on a Cisco router or switch. This involves manipulating the router's configuration register, and that is enough to make some CCNA candidates and network administrators really nervous! It's true that setting the configuration register to the wrong value can damage the router, but if you do the proper research before starting the password recovery process, you'll be fine...
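For reference, the usual recovery sequence on most IOS routers looks roughly like this — a sketch only; the exact rommon prompts and register values vary by platform, so verify the procedure for your model in Cisco's documentation before you start:

```
! Break into ROM monitor during boot (Ctrl+Break on the console), then:
rommon 1 > confreg 0x2142        ! boot while ignoring startup-config
rommon 2 > reset

! The router boots with a blank running config, startup-config intact:
Router> enable
Router# copy startup-config running-config
Router# configure terminal
Router(config)# enable secret NewPassword   ! set your new password here
Router(config)# config-register 0x2102      ! restore normal boot behavior
Router(config)# end
Router# copy running-config startup-config
Router# reload
```

The key idea is that 0x2142 tells the router to ignore NVRAM at boot, letting you in without a password so you can reload the old config and overwrite the secret.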
Read the full article here.
BSCI Exam Tutorial: An Introduction To BGP.
When you're studying for the BSCI exam on the way to earning your CCNP certification, it's safe to say that BGP is like nothing you've studied to this point.
BGP is an external routing protocol used primarily by Internet Service Providers (ISPs). Unless you work for an ISP today or in the future, you may have little or no prior exposure to BGP. Understanding BGP is a great addition to your skill set - and you have to know the basics well to pass the BSCI exam.
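To give a flavor of those basics, a minimal eBGP peering can be sketched like this — the AS numbers and prefixes below are hypothetical documentation values, not a recommended configuration:

```
router bgp 65001
 neighbor 192.0.2.2 remote-as 65002       ! eBGP: peer is in a different AS
 network 203.0.113.0 mask 255.255.255.0   ! advertise this prefix to peers
```

Unlike IGPs, BGP neighbors are configured explicitly and the `network` statement only advertises routes already present in the routing table.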
Read BSCI Exam Tutorial: An Introduction To BGP here
Wednesday, August 23, 2006
Borland's Delphi 2006 graced my doorstep in January. After a month of using it, I can say that besides being my tool of choice for .NET apps, Delphi 2006 may also become my tool of choice for native Win32 apps. Let me explain why.
By nature of my work and by nature of my, well, nature, I tend to work over a broad range of technologies, new and old. As much as I enjoy learning and using the latest and the greatest, I don't use it for production unless it's necessary, part of a larger plan, or at least not a hindrance. As a result, the post-.NET versions of Delphi share my hard disk with Delphi 7, the last native-only version of the compiler.
As a point of reference, I fired up Borland Pascal 7 — the last DOS version of the compiler — as late as 2001, six years after Delphi 1 was first released. But that was for legacy maintenance; I had abandoned the compiler for new projects a few years earlier. I hadn't done so because I had no call for DOS or text-based apps; rather, Delphi had matured so far beyond BP7 that I missed the features of the newer iterations.
This begs the question: When will the new versions of Delphi get so good that I give up my Delphi 7? From one standpoint, that time would seem to be distant, as a scan of various want ads reveals that not a few companies are developing new software in D7. But from another perspective, Delphi 2006 (referred to as DX hereafter) is very close to the threshold where D7 starts to feel a bit antique. Certainly, DX is slower, more complex, and piggy by comparison to D7. Both crawl next to the blazing speed of Borland Pascal 7, but that's not enough to push me back to the command line.
Let's look at some of the pros and cons. For this review, we had the Architect version; the differences between the versions come down to the amount of modeling tools and database connectivity included. The low-end version (called Professional) is still a powerful beast.
Installation is involved, of course, with about a gigabyte's worth of tools dumped on your hard drive — more if you're not up to date with all the latest .NET stuff. I had to go through afterwards and turn off things like SQL Server and Interbase server, which are helpfully installed and automatically activated during Delphi installation. It would have been nice to have these presented as options rather than surreptitiously installed.
I should note here that I did one installation and a lot of testing on a laptop with a mere 512MB of RAM. This made me somewhat hypersensitive to the bulkier/laggier parts of DX.
Firing up the old gal was considerably faster than Delphi 2005 (D9). Also, though it may seem petty, the start-up screen looks a lot better than the last version's. You may still get the impression that ten pounds of crap are being shoveled into a five pound box, but at least it's a nice-looking box.
If you want an even faster startup, and you know what you're going to be developing for, you can use the specific start-ups for Win32 development, for .NET development, or for C# development. (A C++ Builder was also included, and this has since been fully upgraded to a working version available at Borland.com.) DX also started new projects quicker and with less thrashing than the other major .NET development environment.
The IDE and GUI Editing
The IDE has a more polished look than D9, though I can't quite place my finger on why. There are some nice, subtle changes in the editor (covered below) and it defaults again to the VS-style look. I never could manage to make it emulate D7's classic component-bar configuration. It's not that the old horizontal-tab version was so great, but the vertical orientation doesn't work for me at all. The more components I have, the more likely I am to want to browse them, and you have to do a lot more scrolling with the vertical form.
It's not all bad news: the tool palette is quite keyboard-friendly. If you know what you're looking for, you can switch to the palette with Ctrl+Alt+P, then type in the first few letters and the palette will filter. Also, the palette is context sensitive, so that when you're looking at the form designer, you see components you can place on the form, whereas if you're writing source, you see components for creating specific sorts of source files.
As Delphi has evolved, it has, of course, gotten more and more complicated in terms of layout. I've always preferred the detached-SDI layout of classic Delphi to the MDI-ish feel of Microsoft products, and that's still available here. However, with five windows plus the main menu open during most development, it doesn't really make much sense anymore. DX keeps things clean, clearing the message window when it's not needed. Though the real beauty here is how little the message window is needed (which I'll talk about more in the Text Editing section below).
DX's GUI builder demonstrates subtle changes that can make a big difference in your UI design by introducing “dynamic alignment guides”. These helpful little lines stretch out from the control you're placing to nearby controls to show how things align. Not just the edges of controls, but the position of text inside the controls, which gives your UI “clean lines”. As someone who has been known to use align features and be unsatisfied with the result, I found this improved the appearance of even my quick-and-dirty forms immensely.
Connecting a data table in MySQL to an ASP.NET Web page.
Borland also introduces “design guidelines” which at first simply seemed to be a little tooltip window showing the X,Y coordinate of the control. But if you give margins to your forms, a line pops up showing you where those margins are as you move the component around. This was also (literally and figuratively) neat, though it didn't grab me the way the “dynamic alignment guides” did.
Even less interesting to me was the addition of the “form positioner.” This allows you to specify where your form will appear on the screen when your app starts. Maybe people were clamoring for this. It struck me as a bit frou-frou. There are also two new components that emulate a web-like approach to form design.
Anyway, all this stuff may seem minor, but the energy devoted to making tasks easier and making the final product better is a big part of what makes working with Delphi a joy.
Debugging is nigh painless here, which is about the best thing you can say about debugging. The watch window allows expandable objects, so you aren't stuck with a non-helpful pointer address that you have to cast. The local watch window is still here and can also show objects' member details. Generally, the debugger is crisp and responsive. I did notice, however, that leaving a live session up was a good way to hang Delphi on my laptop.
Somewhere around Delphi 6, Borland introduced their optimizing compiler, which was probably the worst thing that ever happened to debugging. Once-valid places to break and examine variables became useless as they were “inaccessible due to optimization”. Those “optimizations” are still there. (You'd think the debugger could have a little “memory” for spied variables it knows are being optimized. I'm sure that's way more complex than it sounds, but I'm a user in this context, so I can make unreasonable demands.)
I also really like the way the IDE allowed me to have my design and debug layout separate, and it generally switched between them as appropriate. It did seem reluctant to switch back from the debug layout from time to time, but I preferred to not interpret that as an editorial comment. Another much-welcomed feature is the option to set files that get opened during debugging to auto-close afterwards. I'm sure I'm not the only one who has ended up with a couple dozen open files after a particularly deep drill-down session.
Note that the SR variable's fields are laid out. Also behold the groovy new line number set-up. You can also see the "Synchronize" method underlined as though it were flawed. Delphi's attitude toward my thread objects was less than friendly.
A series of seemingly minor changes to the IDE have added up to an environment that recalls the glory days of Delphi and Borland. This is where the energy to improve the IDE seems to have been expended, with largely marvellous results. To start, we have line-numbers, numbering every tenth line and marking the rest with tasteful hashmarks (at the fives) and dots (at the ones). No more scanning the bottom of the page to figure out where you are.
As you type, the structure window instantly shows all the errors you're making (and resolving), meaning that you seldom need to invoke the speedy compiler except when you're ready to run. This could be intrusive, but it generally gives you just the right amount of feedback. The IDE keeps track of revisions (with the ability to roll back to previous versions, as in D9), and alerts you to what's been changed but not saved (yellow lines) versus what's been changed and saved (green lines).
And there are a host of new code completion features, including templates for loops that allow you to “fill in the blanks” smoothly most of the time. There are also new features for developing your own code templates, surround templates (for turning code into a block), and completing blocks, and navigating between methods. The new features are easily adapted to and quickly useful.
At its best, the IDE is a masterpiece of usability, giving you all the clues you need to work smoothly without intruding on you.
The flip-side of that, of course, is that when it fails, it's very jarring. And it fails with disturbing frequency. One of the reasons I've never been a big fan of code completion is that you sometimes end up chasing down why your code completion isn't activating rather than concentrating on your task. If you type in MyObject.X and the methods and properties of MyObject that begin with X don't pop up, you wonder if you've made an error, properly initialized your variables, etc. But sometimes, it just doesn't come up. Or, instead of the appropriate class member code completion window, the new generic code templates window will come up.
The same sort of thing can happen with the underline used to show you in the IDE where your errors are. Things that aren't errors can end up underlined. It's disturbing to have the IDE tell you your code is in error when you can run it and it works. Another killer, though this was common on the 512MB machine and not on the 1GB machines, is the occasionally long lag as the IDE struggles to bring up whatever thing it thinks will help.
I have a few issues with the refactoring warnings, which are a little sensitive. For example, an I declared in a local routine — the parent of which already has an I — generates a warning from refactorings, even though the I is not available to the local routine (because it's declared afterwards). In a different case, hitting Enter after a begin generated an end, even though the end had already been generated.
But these feed into the larger issue: "Helpers" which are misleading are worse than no helpers at all. (You can, of course, turn all these things off, but I'm still finding them useful enough that I haven't done that. Yet.) And I have a long-running gripe with things that type in code for me but don't allow me to customize how that code comes out. (I like to align my end with the block that it ends, not the begin that started the block.)
Then there are some things that don't quite make sense to me. You can mouse over a control, and Delphi helpfully tells you which unit it was declared in. This unit indication is marked as a link. You can even click it! But all that happens is that the link changes color (to indicate that it's been clicked, I guess). The file doesn't open, though. Sometimes, as when the declaration is in the same unit, it shows an additional link, the clicking of which actually takes you to the declaration. (One thing you could do in D6 was press F10 on a symbol and then use the cursor keys to navigate the pop-up menu. This made jumping to a declaration a mouse-free breeze. But since D7, the F10 pops up the context menu, and the cursor keys end up manipulating the system menu, uselessly.)
Lumping the minuses (bugs real and imagined) together, they don't weigh significantly against the positives, however. The text-editing experience is a compelling argument for switching—but not the most compelling. But let's look at one big minus first.
Delphi Help peaked somewhere around the late '90s. A big part of this is the seemingly compulsive changes made to the standard Windows help systems. (And it's not just Delphi. It often seems to me like the Help systems go backward rather than forward.) I mean, it's probably not their fault that the help system takes forever to come up and the text pane of the help window sometimes needs to be clicked a couple of times before it respects the mouse wheel on my laptop, but that makes it no less annoying.
At the same time, it's certainly weird that Delphi doesn't even carry its own help over from previous versions. For example, if you pull up the help for FindFirst (to find files) in D7, you get a nice, detailed explanation complete with some code. The code hasn't substantially changed since Windows 3.1. The ability to get to related links is way behind the older Delphi.
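For what it's worth, the FindFirst idiom that old help entry documented is still used the same way; a minimal sketch (the path and mask here are hypothetical, and SysUtils supplies the routines):

```delphi
program ListFiles;

uses SysUtils;

var
  SR: TSearchRec;
begin
  // Enumerate matching files; FindFirst returns 0 on success
  if FindFirst('C:\Temp\*.txt', faAnyFile, SR) = 0 then
  try
    repeat
      WriteLn(SR.Name);   // SR also exposes Size, Time, and Attr fields
    until FindNext(SR) <> 0;
  finally
    FindClose(SR);        // always release the search handle
  end;
end.
```

The FindFirst/FindNext/FindClose triple is exactly the pattern the D7 help walks you through, which is why its disappearance from the new help stings.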
This is not a minor problem. I had serious difficulty finding the topics I needed.
I was prepared to harangue a bit also about why the flyover hints can quote chapter and verse about where my code comes from, where the methods are declared originally and so on, while the "context-sensitive" help made me choose between 400 similarly named items from all the tools. Mysteriously, however, this problem vanished on my laptop after a few weeks, and never materialized on the other machines. (If I knew what had changed, I'd tell you how to fix it.)
There's quite a bit of new stuff here and, as usual, we can't cover it all. But a special call-out needs to go to the latest iteration of ECO (Enterprise Core Objects). I've studied modeling and design for a long time, and the question has always been whether you get more out of using a modeling tool than you do without. The answer is usually a qualified “yes” but with ECO, that qualification gets a lot smaller.
ECO allows you to deal at a very high level in a very smooth way, and it gives you some very easy and rather flexible solutions to the problem of object persistence. It allows you to define state machines —without coding. It allows you to build your own patterns. It even, oh by the way, gives you a ridiculously easy way to put your data on the Web.
Together with Borland's Together product (Architect version), you have a package that's compelling for serious design with enough instant gratification to make it a fun choice even when you're writing a small program. Substantial portions of ECO III are available even in the pro version of DX—a very smart move on Borland's part to introduce everyone to this framework. ECO is strictly .NET, too, and one of the more compelling and progressive uses of the platform.
Building applications with ECO's modeling tool.
There's a ton more in the total package than I can cover in less than a book. I didn't touch on the addition to the Delphi language of methods on records (shades of the original object classes in Borland's Pascal versions 6 and 7), but they're great if you need them. I also didn't mention C#, C++, the new refactorings, or the cool little system tray component (so that you can finally get rid of that third-party add-on you've been using since Delphi 5). But Borland has always been good about packing value into their products, and this is a trend that's gotten stronger in recent years.
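For the curious, a record with methods looks roughly like this — a sketch, with an invented type name, in the Delphi 2006 syntax:

```delphi
type
  TVector = record
    X, Y, Z: Double;
    function Length: Double;   // methods on records: new in Delphi 2006
  end;

function TVector.Length: Double;
begin
  Result := Sqrt(X * X + Y * Y + Z * Z);
end;
```

Unlike classes, records are value types allocated on the stack, so you get method syntax without heap allocation or the need to call Free.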
If you've been using D8 or D9, DX is just a logical upgrade. It just feels better, faster, stabler, and more fun, even with the warts and hiccups. If you've never used a Borland IDE, you could do worse than start with this one. But what if you're a D7 user?
Yeah, I've griped a lot here. Still am griping. But the fact is, I'm using DX a lot more than any Delphi since 7. I have a bunch of Delphi 7 projects, and DX loads them almost seamlessly. I find myself less interested in going back, though — not because of the promise of .NET, but due to the super-cool features. Borland has a lot to be proud of with this release. Do yourself a favor and check it out.
Friday, June 09, 2006
- Exam 70-292 - Managing and Maintaining a Microsoft Windows Server 2003 Environment for an MCSA Certified on Windows 2000. The Microsoft Certified Systems Administrator (MCSA) on Windows Server 2003 upgrade exam is available only to candidates who are currently certified as MCSAs or as MCSEs on Windows 2000. The MCSA on Windows Server 2003 credential is intended for IT professionals who work in the typically complex computing environment of medium to large companies. More details here.
- Exam 70-293 - Planning and Maintaining a Microsoft Windows Server 2003 Network Infrastructure. The Microsoft Certified Systems Engineer (MCSE) on Windows Server 2003 credential is intended for IT professionals who work in the typically complex computing environment of medium to large companies. An MCSE candidate should have at least one year of experience implementing and administering a network operating system. More details here.
- Exam 70-294 - Planning, Implementing, and Maintaining a Microsoft Windows Server 2003 Active Directory Infrastructure. The Microsoft Certified Systems Engineer (MCSE) on Windows Server 2003 credential is intended for IT professionals who work in the typically complex computing environment of medium to large companies. An MCSE candidate should have at least one year of experience implementing and administering a network operating system. More details here.
Wednesday, May 10, 2006
Using the new DFS in Windows Server 2003 R2
R2 is an interim or "upgrade" release of Windows 2003. It is an optional upgrade, but has some very nice features such as the new DFS. Look here for more details on R2. Before we continue this discussion, it is important to note that "DFS" previously referred to shares and namespace management. Beginning with the Windows Server 2003 R2 release, "DFS" is an umbrella term that refers to both namespaces and replication. The term "DFSR", at least as it is used at this time, refers to the new replication engine.
Active Directory scripting secrets: When GUI just isn't enough
While it's true that Active Directory provides a number of easy, wizard-driven Graphical User Interface options to create objects and perform many common administrative tasks, to be a truly effective admin you'll often need to get away from the GUI and find a more efficient way to operate.
Extracting Active Directory info quickly and easily with LDIFDE
As mature as Active Directory is, it still amazes me how many admins I talk to who have no idea how to write simple LDIFDE.exe commands to gather data for routine operations. My next few articles will give you some simple instructions on how to take advantage of this tool to gather AD data without using those painful UIs - even for the scripting impaired!
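As a taste of what those articles will cover, a typical export command might look something like this — the OU, domain, and attribute list below are hypothetical placeholders, so substitute your own:

```
ldifde -f users.ldf -d "OU=Sales,DC=example,DC=com" -r "(objectClass=user)" -l "cn,mail,title"
```

Here -f names the output file, -d sets the root of the search, -r supplies an LDAP filter, and -l limits the attributes returned; run from a domain-joined machine, this dumps matching user objects to a plain-text LDIF file you can inspect or re-import with -i.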
Monday, May 01, 2006
Customers are well aware of the threats they face from viruses and worms, but a survey of some 550 small and midsize businesses found that human error was the primary cause of nearly 60 percent of security breaches during the past year, said Brian McCarthy, COO of the Computing Technology Industry Association (CompTIA), Oakbrook Terrace, Ill., which sponsored the study.
“The alarming part is that little is being done to change cultural behavior,” McCarthy said. “End-user awareness [of security issues] is a big problem in companies. Organizations that provide security training to employees will see ROI.”
Brian Haboush, vice president of business development at Intelligent Connections, a Royal Oak, Mich.-based solution provider, agreed. “We find the biggest vulnerability in corporate networks to be caused by misconfiguration of equipment due to lack of training,” he said. Haboush sees increasing demand for security training and is expanding his training offerings for IT staff and for the executive ranks.
Most of the flaws that emerge in the security and vulnerability assessment realm are due to misconfigurations and poor application of corporate security practices, which points to a need for training, said Paul Rohmeyer, a professor at Stevens Institute of Technology, Hoboken, N.J., and former COO of North Brunswick, N.J.-based security solution provider Icons.
VARs should include security training in the solutions they offer to help companies effect cultural change and minimize human error, McCarthy added. “There are opportunities to bring a training solution into the equation to make sure products they have installed are fully realized,” he said.
Monday, March 27, 2006
A firewall acts as a security guard, standing watch over data as it travels in and out of your network. Yet unlike a security guard, it doesn't rely on magic passwords or secret keystrokes to determine whether to accept or deny data. Instead, it refers to a set of rules in order to determine what is allowed to pass. While a properly configured firewall can keep unauthorized users out, an improperly configured one can halt both incoming and outgoing network traffic.
Both hardware and software firewalls are effective ways to safeguard your network, but we will focus here on setting up the Cisco PIX 501 firewall, available from TechSoup Stock.
This guide will explain what makes a PIX 501 firewall different, and describe its basic configuration. It also includes helpful Cisco links, a glossary, and resources for networking with other TechSoup users and consultants.
First Things First: What the Heck is a Firewall?
A firewall is hardware or software that blocks communications forbidden by security policy in a networked environment. A firewall's packet-filtering rules permit or deny certain users from sending or receiving certain types of information. Packet filters have no idea what type of traffic is running on a given port, only which ports and IP addresses the traffic is going to or coming from.
A special breed, the PIX firewall is more intelligent than your average packet filter. Its stateful inspection takes the TCP state of Internet traffic into consideration, allowing return traffic back in only if the connection originated within the network. Cisco's PIX also offers additional protocol options to make sure that incoming and outgoing protocols are legitimate.
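The difference between a plain packet filter and stateful inspection can be illustrated with a toy Python sketch. This is purely conceptual and bears no relation to how PIX is actually implemented; the IP addresses and port numbers are made up:

```python
# Toy model of stateful inspection: outbound connections are recorded,
# and inbound traffic is admitted only if it matches an established flow.
# A plain packet filter, by contrast, would look only at ports and IPs.

established = set()  # (inside_ip, inside_port, outside_ip, outside_port)

def outbound(src_ip, src_port, dst_ip, dst_port):
    """An inside host opens a connection; remember the flow."""
    established.add((src_ip, src_port, dst_ip, dst_port))

def inbound_allowed(src_ip, src_port, dst_ip, dst_port):
    """Admit an inbound packet only if it is a reply to a flow
    that originated inside the network."""
    return (dst_ip, dst_port, src_ip, src_port) in established

outbound("10.0.0.5", 49152, "203.0.113.9", 80)   # a browser opens a web page
print(inbound_allowed("203.0.113.9", 80, "10.0.0.5", 49152))   # True: reply
print(inbound_allowed("198.51.100.7", 80, "10.0.0.5", 49152))  # False: unsolicited
```

The second check fails because no inside host ever initiated a connection to that outside address, which is exactly the property that keeps unsolicited traffic out.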
Before setting up your firewall, check to make sure that it came with the following:
- A beige PC terminal adapter
- A blue console cable
- A yellow straight-through cable
- An orange crossover cable
- A power supply
- A power cable
- Installation software
- Your PIX firewall
Setup isn't very intuitive, so check your configuration with this Cisco documentation (PDF). (Go to Section 3, "Connect the Cables.")
There are two options for setup: console and graphical. If you want to go with a graphical setup, you'll find the PIX Device Manager -- an HTML configuration application -- bundled with PIX.
To get started with the graphical setup, point any Java-enabled browser at the PIX's internal IP address (in this guide, https://172.16.0.1). (Be sure to use "https" instead of "http" or the connection will fail.)
After you've entered your IP, the browser window will open, and you'll see a gray box, which is the PDM start screen. Go to the System Properties tab, expand Logging in the tree view, and select Logging Setup. Check the Enable Logging box and select Syslog in the tree view. If you've just installed PIX, then you won't see a configured server in the list by default.
To add a server, hit the Add button and start entering data into the dialog that appears. Because your syslog server is typically located on your network, select Inside from the Interface dropdown list.
Move on to the Protocol section, where you'll find two radio buttons. Select UDP as the protocol. The next data entry area is the port value: enter "514," which is both the default and the standard syslog port.
Once you've configured PIX's syslog settings, click OK to get back to the main screen. At this point, your configuration hasn't been saved. To do this, hit Apply to PIX and wait. This may take a while, depending on your configuration.
You'll also need to get those settings onto the firewall by hitting the Save to Flash Needed button on the top of the screen. If you fail to take this step, PIX may not forward syslog messages, or stop forwarding them after a reboot. Once everything has been saved, a dialog box reading "Configuration saved to flash memory" will appear.
You should now be receiving syslog messages on your configured syslog server. At this point, close PDM. (You can always return to it later if you need to make adjustments.)
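If you want to sanity-check that messages are actually arriving, a few lines of Python can stand in as a bare-bones listener. Run a real syslog daemon in production; note that binding port 514 usually requires administrator privileges, and the defaults below are placeholders:

```python
# Minimal UDP listener standing in for a syslog server, to confirm that the
# PIX is forwarding messages. Syslog over UDP is just a datagram per message,
# so receiving one datagram is enough to prove the forwarding path works.
import socket

def listen_once(host="0.0.0.0", port=514, timeout=None):
    """Wait for a single syslog datagram and return (sender_ip, text)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    if timeout is not None:
        sock.settimeout(timeout)
    try:
        data, addr = sock.recvfrom(4096)  # one syslog message per datagram
    finally:
        sock.close()
    return addr[0], data.decode(errors="replace")
```

Call `listen_once()` on the machine you configured as the syslog server, then trigger any logged event on the PIX; if the call returns a message, forwarding is working.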
Before changing or modifying anything, make sure you know what you're doing, as inputting the wrong data or port can disrupt your network's traffic. Full documentation on configuring your PIX with the graphical installer can be found on MonitorWare.
Those who are more comfortable using the terminal can set up PIX using a terminal emulation program to talk to the PIX on a console port. (For instructions, check out TechSoup's article A Guide to Installing a Cisco PIX 501 Firewall: Advanced Setup .)
I Need Help!
Cisco provides excellent technical documentation, so if you’re not quite sure where to start, visit one of these links for help:
- Cisco PIX Firewall Configuration Guide, Version 5.0
- Cisco's PIX Section
- Cisco's 501 Product Line
- Configuration Examples and TechNotes
- Configuration Guides
- Install and Upgrade Guides
- Cisco PIX 501 Firewall Quick Start Guide, Version 6.2 (PDF)
- Cisco PIX 501 Firewall Quick Start Guide, Version 6.3 (PDF)
Sometimes computer terms can sound a bit like gibberish. Read our glossary of important terms you may come across as you configure your firewall.
Static NAT: Also known as "one-to-one" NAT. For every public IP, there is one private IP statically mapped to it. An organization that has exactly one public IP address for every computer could use this sort of scheme. By statically NAT-ting computers on a firewall, a network administrator can filter out certain types of inbound traffic.
Dynamic NAT: Also known as "many-to-many" NAT. Statically NAT-ting all of the IPs in your organization is not always practical, as public IPs are not easy to come by. Instead, a range of public IP addresses is shared among many private IP addresses. If someone is only surfing the Web, there's no need for them to have their own dedicated public IP address.
Overloading: Also known as Port Address Translation (PAT), "NAT overloading," or "many-to-one" NAT. Overloading describes what we did in our basic configuration example. It is used when there is only one IP address to share with many people. The unique source port determines which internal (private) IP address gets the return traffic.
Overlapping: Used when your public IP addresses "overlap" with the public IP addresses of another network. The router translates the address in order to avoid a potential conflict with this other network.
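The overloading (PAT) idea above can be sketched in a few lines of Python. This is a conceptual toy, not how any real NAT device works: the addresses and ports are invented, and real PAT also tracks protocol, timeouts, and much more:

```python
# Toy sketch of NAT overloading (PAT): many inside hosts share one public IP,
# and the translated source port identifies which host gets the return traffic.

PUBLIC_IP = "203.0.113.1"   # the single shared public address (illustrative)
next_port = 20000
translations = {}            # public source port -> (private_ip, private_port)

def translate_outbound(private_ip, private_port):
    """Rewrite an outbound packet's source to the shared public IP,
    assigning a unique public source port for this flow."""
    global next_port
    port = next_port
    next_port += 1
    translations[port] = (private_ip, private_port)
    return PUBLIC_IP, port

def translate_inbound(public_port):
    """Use the source port from the outbound translation to route
    return traffic back to the right inside host."""
    return translations[public_port]

pub_ip, pub_port = translate_outbound("192.168.1.10", 49200)
print(translate_inbound(pub_port))  # ('192.168.1.10', 49200)
```

Two inside hosts can even use the same private source port; they still get distinct public ports, which is what keeps their return traffic from colliding.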
Help from TechSoup's Forums
Don't forget to tap your TechSoup community as a resource to help you set up your firewall. If you run into problems or want to help others, drop by TechSoup's Networks or Virus Vaccination and Computer Security forums and let us know what did and didn’t work for you.
Before you post, please make sure that you strip out any important identifying information (such as your public IP address). Post your configuration and a general description of your network layout and what you’re trying to accomplish and one of our moderators or forum members will do their best to point you in the right direction.
Finally, remember: While installing a hardware firewall is one of the best things you can do to secure your network from the outside, security should by no means end there. Firewalls alone cannot fully protect you from Internet threats, vengeful employees, a lack of password protection, spyware, or even viruses. Be sure to take precautions to safeguard your network from these threats, too.
Monday, March 13, 2006
The survey results were released in conjunction with the RFID World 2006 conference.
Seventy-five percent of the technology companies participating in the CompTIA survey said they do not believe there is a sufficient “pool of talent” in RFID technology to hire from. That figure is down slightly from a similar survey conducted in 2005, when 80 percent of respondents said there was a shortage of RFID talent.
Among companies that believe there is a talent shortage, 80 percent said the lack of individuals skilled in RFID will impact adoption of the technology. The figure is significantly higher than a year ago, when 53 percent of responding companies said the shortage of talent would have a negative impact on RFID adoption.
“RFID is a complex and still evolving technology, and expertise is absolutely required for its usage to be a success,” said David Sommer, vice president, electronic commerce, CompTIA. “The skill sets and ‘need-to-knows’ related to RFID are many and varied. Clearly there is work to be done in our industry in terms of RFID education, training and professional certification.”
Sommer presented the findings of the CompTIA RFID skills survey in a presentation at RFID World 2006.
The global IT trade association CompTIA's initiatives extend to areas such as convergence technologies, electronic commerce, information security, IT services, public policy, skills development, and software.
Friday, March 10, 2006
Touted by some as Microsoft's "Google killer," the new search engine was up and down for quite some time last night. I think I caught the developers uploading the UI because there were some really ugly versions before things started to settle down...
While the site was working, before it got overwhelmed with traffic (probably because the Drudge Report and other sites were flogging it) I did get a glimpse of the interface and had a chance to test some of the site out.
First impression: Big improvement over MSN Search. Google killer? No, not yet. But not bad either.
The "live.com" default interface is extremely simple (hmmm, wonder who thought of that idea?). It's after you've begun your search that the fun begins.
When a search returns results, the page mutates and, besides the classic search text entry box (and of course, your results), a series of tabs appear that allow for more custom searching and feed harvesting.
Nothing exotic, these tabs include "web", "news", "images", "local", and "feeds."
The "web" results are very Google-like, and display returns in the now-classic style: head, deck and URL. You can customise how much of this information you want with a handy little control on the right side of the interface. Routine search returns were not stunningly better (or worse) than Google's. In fact, in my very early tests, the results from both sites were pretty close.
The strongest part of the tool, at least in my early look, is image search. I've never been totally thrilled with Google's image search tool, which requires more work than I like before I can get to the actual picture.
There are just about as many steps involved in getting to the picture in the "Windows Live" image search tool, but it feels friendlier. First, you get a bunch of thumbnails (like Google) but I liked the jazzy way the thumbnails "pop" open and give you a bit more info when you mouse over them in Windows Live Search. Click an image and the UI changes again to include the page where it originally appeared.
Your original search results thumbnails are still visible, but re-located to the left side of the page. You can grab the original picture, sans web page, with the "show image" link at the top of the frame.
I found the "local" tab somewhat amusing when I typed in the word "Audio". Not knowing what to expect, I got a U.S. map and business address matches that were heavily concentrated in South Dakota. I know not why. Since I won't be shopping in Sioux Falls this week, I added an address to the search term and things improved a bit.
The feeds tab worked well. After signing in to my Microsoft Passport account, I was easily able to add a few new feeds to "My Stuff". It was a little hard figuring out how to get back to "My Stuff" after grabbing the feed (clicking the "add to live.com" with the green cross will do it, but it's not exactly obvious), but I did like the clean display in the My Stuff area.
The site went down last night, so I couldn't try out more of its features, but it shows some promise...but then almost anything is going to be an improvement over MSN search...
We'll have a lot more on this site in the coming days, but do tell us your first impressions...Here's a pretty good backgrounder on the beta.
UPDATE: They've crudded up live.com with promo material, news crawls, and announcements. Bleh.
Wednesday, March 08, 2006
On Monday, at the VoiceCon 2006 conference in Orlando, Fla., Cisco said it will add support for session initiation protocol, or SIP, to its IP PBX software. The new version of the product, CallManager 5.0, will include SIP capabilities for Cisco IP phones, presence-awareness software and multimedia communications software.
SIP is used to establish contact between IP phones and to add special features--such as presence awareness, video or mobility capabilities--onto a voice over Internet Protocol (VoIP) network. The standard also makes it possible for companies deploying VoIP to mix and match the products they use, significantly lowering the cost of deploying a VoIP network.
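To give a feel for what SIP looks like on the wire, here is a deliberately stripped-down sketch of a SIP INVITE built in Python. A real implementation (RFC 3261) requires several more headers (Via, CSeq, Contact) and usually an SDP body; the addresses and Call-ID below are illustrative only:

```python
# Simplified sketch of the SIP INVITE that starts a call between two IP
# endpoints. SIP is a text protocol, so the request is just CRLF-joined lines.

def make_invite(caller, callee, call_id):
    """Build a minimal (incomplete) SIP INVITE request as a string."""
    lines = [
        f"INVITE sip:{callee} SIP/2.0",   # request line: method, target, version
        f"From: <sip:{caller}>",
        f"To: <sip:{callee}>",
        f"Call-ID: {call_id}",            # uniquely identifies this dialog
        "Content-Length: 0",              # no SDP body in this sketch
    ]
    return "\r\n".join(lines) + "\r\n\r\n"

msg = make_invite("alice@example.com", "bob@example.com", "a84b4c76e66710")
print(msg.splitlines()[0])  # INVITE sip:bob@example.com SIP/2.0
```

Because the format is plain text, any vendor's phone can parse any other's requests, which is the interoperability property that lets companies mix and match VoIP products.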
Cisco had been the only major supplier in the market not to support SIP in its IP PBX software. Cisco sees the addition of SIP as an important step in being able to provide customers more features.
"IP telephony isn't just about toll bypass anymore," said Barry O'Sullivan, vice president of IP communications for Cisco. "It's about improving productivity and allowing people to do their jobs more effectively. And people need to be able to communicate and collaborate through the means that suits them best."
CallManager 5.0 should work with any SIP-based phone, but Cisco said specifically it plans to support a "softphone" (or PC-based phone) client for Research In Motion's BlackBerry handheld as well as Nokia's new dual-mode phones.
In addition to the upgraded CallManager, Cisco announced other new products including the Unified Presence Server, which collects status and availability data from users' devices and feeds it to Cisco applications, and the Unified Personal Communicator, which allows users to see on their PCs or IP phones who is online.
As part of the announcement this week, Cisco said it is working with Microsoft to integrate its Office Communicator 2005 and Office Live Communications with Cisco's Unified Communications System. The integration means that users can launch a VoIP conversation directly from their Microsoft Outlook client. The interoperable package should be available in August 2006, the companies said.
Monday, March 06, 2006
Ever since Google released its highly acclaimed GMail service, many users have found interesting ways to make the most of the available space provided by Google. While other free email services battle over megabytes of free space, Google currently leads all other services by the gigabytes. Using third-party utilities, it is possible to map your GMail account as a pseudo-drive in Windows and use the account as a drag-and-drop file system. With these tools, some users have even sent themselves invites to chain together accounts for an effectively unlimited amount of network storage space.
According to reports, however, sometime in the near future this activity may no longer be limited to third-party utilities. On Google's analyst day, a presented document contained information about a possible service called GDrive. The details in the presentation indicate that Google's long-term goal is to provide a service that gives users unlimited storage space so that any type of file can be uploaded and stored. The presentation even indicates the service may be built to allow users access to their files from any device, in any location. The Google presentation, before it was edited and removed by Google, read (emphasis ours):
Theme 2: Store 100% of User Data
With infinite storage, we can house all user files, including: emails, web history, pictures, bookmarks, etc and make it accessible from anywhere (any device, any platform, etc). We already have efforts in this direction in terms of GDrive, GDS, Lighthouse, but all of them face bandwidth and storage constraints today.
Naturally, privacy concerns are rising with regards to Google's goals of collecting information. In the presentation, Google even indicates that it plans to collect "all" of the world's information, not just some of it. In this regard, it could be possible for Google to provide high-level services for government bodies that wish to collect information in a manner that would otherwise be too difficult without Google's search spiders.
There is no information on whether Google also plans to offer these types of storage services as fee-based subscriptions, though Garett Rogers from ZDNet hypothesizes:
In some screenshots of Gmail for domains, it appears there are different "account plans" that I assume provide additional email addresses. Could a similar system work for online storage? For example, 1GB free and pay $5 for each additional.
Tuesday, February 28, 2006
Scheduled for release this year, Windows Vista will come in two versions for businesses, three for consumers and one for emerging markets.
The home versions will be called Home Basic, Home Premium and Ultimate, the company said.
"We live in a digital world that is filled with more information, more things to do and more ways to communicate with others than ever," said Mike Sievert, corporate vice president of Windows Product Management and Marketing at Microsoft.
"The PC needs to give people the clarity and confidence to handle this 'world of more' so they can focus on what's most important to them. With our Windows Vista product line, we've streamlined and tailored our product lineup to provide what our customers want for today's computing needs."
Monday, February 13, 2006
At the upcoming RSA Conference in two weeks, Cisco plans to debut major security products to help bolster its already strong security portfolio.
Security is categorized as one of the vendor's six Advanced Technologies and already brings in approximately US$2 billion per year in revenue, though routing and switching still account for more than 60 percent of Cisco's revenue.
The company has 1,500 engineers working solely on security products - VPN, firewall, intrusion-prevention, intrusion-detection systems (IPS/IDS) and other technologies. Hundreds more engineers work across its various infrastructure product lines to integrate security features into network gear.
Cisco is slated to announce upgrades to several of its key security products at the event. An upgrade to its Adaptive Security Appliance (ASA) 5500, a VPN/firewall/IPS device, is due. Also on tap are upgrades to Cisco's Integrated Services Routers (ISR) and its Monitoring, Analysis and Response System (MARS), which orchestrates network infrastructure responses to virus/malware threats.
Cisco CEO John Chambers is one of the headliners at the show and is expected to push a theme of more tightly integrating security with infrastructure components.
"If you're going to provide security, Cisco's very uniquely positioned to do that," Chambers said in a recent interview.
Looking at the breadth of Cisco's security portfolio - and its market share in security products - Chambers' statement is hard to refute. The company leads in worldwide sales and shipments for most major security product categories, including VPN equipment and appliances, firewalls, and IPS and IDS, according to Infonetics Research. (But its total share in any of these markets is less than 40 percent; a vast difference from its core routing and switching markets, where it holds 70 percent to 80 percent market share).
Through a series of acquisitions over the last two years, Cisco has spent over a half-billion dollars enhancing its product portfolio to address security in almost every area of a network. It added traffic-anomaly detection with its Riverhead acquisition in 2004, as well as monitoring and client-scanning software from Protego and Perfigo. The vendor has since turned these acquired technologies into products, or components of its Network Admission Control (NAC) architecture, which uses scanning technology to block malicious users via routers and switches.
"Security is not done in any one place" or product line, says Richard Palmer, vice president and general manager of Cisco's VPN and security business unit. "We focus on security not just as a set of technologies or functions that are done in one box, but more as a system."
An example of Cisco's multi-product integration of security is its MARS product, which can interpret signals and alerts from IPS gear and react by sending policies to routers and switches. NAC technology is another example, Palmer says. Cisco even reaches into desktops with its Security Agent (part of NAC), which works with third-party anti-virus software and alerts a NAC-enabled infrastructure of potential threats on a client machine.
Cisco says all of these areas will fall under its latest plan for enterprise customers - Service-Oriented Network Architecture (SONA), announced in December. Under the SONA concept, security would be built into every piece of a network infrastructure and would be delivered as a service along with applications, voice and mobility.
Cisco is not alone in chasing the billions of dollars of potential revenue in the market for securing enterprise network infrastructure and applications. Most of Cisco's switch/router competitors - Alcatel, 3Com, HP, Enterasys and Nortel - have products similar to Cisco's NAC and MARS offerings.
Meanwhile, start-ups are defining the next generation of Web application firewalls, which protect SOA applications from attack and misuse. Vendors such as NetContinuum, Magnifier (bought by F5) and Teros (purchased by Citrix) offer application-layer security features not yet in Cisco's portfolio.
Network access control vendors EdgeWall, Lockdown Networks, Mirage Networks, Nevis Networks and Vernier are entering the market as Cisco slowly joins the Layer 2 switch network access control market, which it helped create.
Before Cisco gets too far into next-generation security technology, some users of its products say there's plenty to improve upon in its current lines.
"I'm leery of any vendor that says they have the do-everything security solution," says Scott Pinkerton, network services manager at Argonne National Laboratory, a U.S. Department of Energy research center operated by the University of Chicago. "Every organization is so nuanced and different that one-size-fits-all is really hard to do with security. No security solution is easy. . . . They all require more tuning than you'd ever like."
Even with this philosophy, Argonne uses Cisco security gear, from its VPN 3000 concentrator to its PIX firewall and IPS/IDS equipment.
Three areas in which Cisco security gear needs to improve are "integration, integration, integration," Pinkerton says jokingly.
The network staff at Argonne uses a mix of custom scripting, some management tools from Cisco and other software to tie together Cisco firewalls and IDS sensors, allowing Pinkerton to dynamically reconfigure firewall policies when threats are detected. "Today we do that ourselves, but Cisco's security products do not," he says. "Why is that?"
While Cisco tries to make advances on the security products front, it is kept busy by the growing number of reported hackable flaws and vulnerabilities in the very security products it pitches.
The company has released eight new or updated product security advisories so far in 2006, affecting products ranging from its VPN 3000 and MARS to VOIP gear and IOS software.
"There's no vendor out there that's perfect" in terms of product vulnerabilities, says Zeus Kerravala, an analyst with The Yankee Group. "But while Cisco's strength is their installed base, it's their weakness regarding vulnerabilities. There are far more people that are going to try and hack into a Cisco router than" other network products.
Cisco's Palmer says the company's top priority is to better secure the devices it sells to safeguard customer networks.
Each Cisco product group shares best practices for writing secure code and building hardware that is harder to hack, Palmer says. "We're looking at this in terms of vulnerabilities, in terms of requiring authentication on multiple levels and in terms of securing the control plane along with the [regular] traffic."
Making it easier for users to quickly change, patch or fix flawed gear is another area in which Cisco could improve. "Cisco also needs to do a better job of educating customers on best practices for security on their devices," Kerravala says. "They have to come up with better configuration management tools and best practices to make sure that vulnerabilities are minimized."
He says Cisco has made some strides in making its products more systemic.
"Cisco's whole security product portfolio is made up of a bunch of acquisitions," he says. In that sense, buying Cisco VPN, IPS and firewall gear was more like buying products from three different vendors instead of a single security solution or system.
"The value Cisco can add is to put some kind of management framework on top of it and make it look like a system," Kerravala says. "That's where they put a lot of effort, and where they should put a lot of effort."
"In the emerging areas - such as SSL and IPS - Cisco is never going to be the industry trendsetter," he says. "You've got small dedicated start-ups with an entire company doing nothing but these technologies. Cisco can't maintain product leadership across all categories in all moments in time."
Products from pure-security vendors such as Arbor Networks, Check Point, Cybershield, Internet Security Systems and Sourcefire are still held in higher esteem by some network security aficionados and experts than infrastructure-based offerings from Cisco and its ilk.
Part of the reason Cisco will never dominate security the way it does routing and switching is that security technology is constantly evolving, observers say.
"Cisco is very strong where they have account control and where they have a lot of network equipment," says John Oltsik, an analyst with Enterprise Strategy Group. "Where Cisco's influence is weaker is in any organization where the security department is more dominant in selecting products."
Here, security "pure-play" vendors are more likely to get as much time and consideration as Cisco, as opposed to enterprise network groups that use Cisco gear, and may not look at competitive routers and switches often, Oltsik adds.