    Layer 3 Switches: Still The Gold Standard

    December 21st, 2014

    Layer 3 switches can slash router latency, but the current lack of unified industry standards necessitates cautious deployment of the technology

    Reducing router latency is one of the top priorities of many network managers, so the industry’s crazed rush to deliver Layer 3 switching technology is not surprising. Unfortunately, little agreement has been reached on a unified Layer 3 switch design, and it doesn’t appear that standards are waiting in the wings.

    Because of this lack of uniformity, PC Week Labs urges network administrators to take only cautious steps toward implementing this new technology. Those desperate to deploy Layer 3 switches should start from the outer edges of the network, rather than overhauling the backbone with products that might not fully interoperate with installed infrastructures or future standards.

    Detouring around congestion

    Layer 3 switch technology can substantially reduce router latency and turbocharge communications. A logical extension to LAN switching, Layer 3 switching delivers routing capabilities–which normally occur at the OSI Network layer–at the lower latencies associated with OSI Layer 2 (the Data Link layer) switches.

    When network congestion in shared-bandwidth LANs prompted corporations to adopt LAN switching, the conventional routers in backbones and at the outer edges of networks remained unchanged. Meanwhile, network traffic kept growing, and because those routers simply cannot keep pace with switches, they became the new bottleneck. Enter Layer 3 switching.

    Switches and routers operate at different OSI layers. Switches function at Layer 2, whereas routers operate at Layer 3 (the Network layer). Responsible for providing a packet-transfer service, Layer 3 manages a variety of information within a protocol stack, such as TCP/IP. Layer 2 switches, in contrast, are simple frame-forwarding devices that use only MAC (media access control) addresses to choose a frame’s destination, so they incur far less overhead reading and manipulating each packet; as a result, Layer 2 switching is much faster than Layer 3 routing. By building Layer 3 routing intelligence into the Layer 2 switch’s hardware, Layer 3 switches eliminate the bottleneck usually associated with routing between subnets.
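
    To make the distinction concrete, here is a minimal, hypothetical sketch (in Python, with invented table entries) of the two lookups. A Layer 2 switch does one exact match on the destination MAC address; a Layer 3 device must find the longest-matching route and rewrite the packet, which is where the extra latency traditionally comes from.

        import ipaddress

        mac_table = {                     # learned MAC address -> egress port
            "00:60:8c:12:34:56": 3,
            "00:a0:c9:ab:cd:ef": 7,
        }

        route_table = [                   # (prefix, egress port)
            (ipaddress.ip_network("10.1.0.0/16"), 1),
            (ipaddress.ip_network("10.0.0.0/8"), 2),
        ]

        def l2_forward(dst_mac):
            # One exact-match lookup; flood the frame if the MAC is unknown.
            return mac_table.get(dst_mac, "flood")

        def l3_forward(dst_ip, ttl):
            # Longest-prefix match plus TTL handling: more work per packet.
            addr = ipaddress.ip_address(dst_ip)
            matches = [r for r in route_table if addr in r[0]]
            if not matches or ttl <= 1:
                return None               # no route, or TTL expired: drop
            best = max(matches, key=lambda r: r[0].prefixlen)
            return best[1], ttl - 1       # egress port, decremented TTL

        print(l2_forward("00:60:8c:12:34:56"))   # -> 3
        print(l3_forward("10.1.5.9", 64))        # -> (1, 63)

    A Layer 3 switch, in effect, moves the second function into the same hardware path as the first.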

    In the context of VLANs (virtual LANs), Layer 3 switching allows network managers to define broadcast domains within a switched LAN using specific protocol criteria, such as an IP subnet or an IPX network number. Other approaches to creating VLANs are port-based and MAC address-based. Although VLAN specs are defined by IEEE (Institute of Electrical and Electronics Engineers) working group 802.1Q, the IEEE 802 committee is concerned only with the first two layers of the OSI stack, so Layer 3 VLANs remain proprietary. Therefore, 802.1Q-compliant products follow the specification only for port-based and MAC address-based VLANs.
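
    The difference among the membership schemes is easy to see in code. The following sketch (hypothetical rules and addresses, not any vendor’s implementation) classifies traffic the way a protocol-based VLAN would, falling back to port-based membership when there is no Layer 3 match:

        import ipaddress

        port_vlans = {1: "VLAN_A", 2: "VLAN_B"}     # port-based membership
        subnet_vlans = [                            # IP-subnet (Layer 3) membership
            (ipaddress.ip_network("192.168.10.0/24"), "ENGINEERING"),
            (ipaddress.ip_network("192.168.20.0/24"), "FINANCE"),
        ]

        def classify(ingress_port, src_ip=None):
            # A protocol-based VLAN looks inside the packet; port-based does not.
            if src_ip is not None:
                addr = ipaddress.ip_address(src_ip)
                for net, vlan in subnet_vlans:
                    if addr in net:
                        return vlan
            return port_vlans.get(ingress_port, "DEFAULT")

        print(classify(1))                     # -> VLAN_A (port-based)
        print(classify(1, "192.168.20.7"))     # -> FINANCE (subnet-based)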

    What’s on the shelves so far?

    A number of vendors have shipped Layer 3 switches or are planning to do so in the near future.

    But not only are their designs proprietary, there’s not even a standard naming convention: Vendors use terms such as “routing switches,” “multilayer switches” and “Layer 3 switches.” To add to the confusion, vendors describe their switches as everything from network-layer-based VLAN devices to full-blown IP and IPX super-routers.

    The lack of industry standards and the seemingly subtle design variations have translated into vast differences in product functionality.

    Among the vendors offering Layer 3 switches are Xylan Corp., Bay Networks Inc., Madge Networks Inc. and Foundry Networks Inc. Most of these switches currently perform IP, IPX, or both IP and IPX routing.

    Some Layer 3 switches implement Layer 3 forwarding in software, which allows the switch to be updated to support whatever Layer 3 switching standards eventually emerge. The majority, however, use ASICs (application-specific integrated circuits) to implement Layer 3 intelligence in hardware, casting the switch’s Layer 3 implementation in (silicon-based) stone.

    Because most of the switches on the market today are based on proprietary ASICs, each switch design is inherently different, making interoperability among different Layer 3 switches impossible.

    In fact, some vendors achieve Layer 3 switching by replacing traditional routers and switches at both the core and outer edges of the network with their proprietary solutions. Others, such as Cisco Systems Inc., try to leverage their market share and steer the industry toward accepting their proprietary scheme as the definitive standard.

    Layer 3 switches need to interoperate with such network layer protocols as IP. Switches that incorporate standard routing protocols, such as RIP (Routing Information Protocol) and OSPF (Open Shortest Path First), can advertise themselves as routers to end stations and other routers in the network, and can therefore coexist within an infrastructure.

    This type of Layer 3 switch–especially when deployed at the outer edges of networks–can coexist with current routers and end stations, although interoperability with other Layer 3 switches from different vendors is not guaranteed.

    Network administrators must determine what type of Layer 3 switching–ranging from protocol-based VLAN definitions to full Layer 3 routing capabilities–they need to implement. Emerging standards, such as the 802.1Q specification for VLANs, must be closely monitored, and potential buyers should know whether hardware changes or software augmentations will be necessary to accommodate those upcoming standards.

    Perhaps most importantly, network managers should understand whether their current infrastructure devices need to be replaced or reconfigured to interoperate with the new Layer 3 switches.

    On the bright side, most Layer 3 switches can provide Layer 2 functionality. Thus, paying a few extra bucks per port (compared with Layer 2 switch prices) is not as risky as it sounds, because you can always use these boxes as traditional Layer 2 switches.

    Beyond the up-front cash outlay, a Layer 3 switch is inherently more complex to configure and manage than a traditional router–at least initially. Be prepared to make a large investment in equipment and personnel training.

    The corporate move toward IP as the main transport protocol, coupled with the lure of reduced router latency, is tempting many network professionals to jump onto the Layer 3 switch bandwagon. Even so, we urge a cautious approach to current offerings: Separating fact from vendor hype may be the best thing you can do for your network’s health.

    Desktop Videoconferencing: Once Huge, Now Skype’d Away

    December 4th, 2014

    Network managers, fearful that videoconferencing would consume excessive amounts of network bandwidth, have relegated this function to specialized-room systems with ISDN connections. But now, thanks to faster networks and the new H.323 videoconferencing interoperability standard, videoconferencing is coming to the desktop.

    Although desktop ISDN systems have been available for a while, IS managers have been reluctant to spend money to bring in additional ISDN lines. But things are changing: LANs are much faster, and LAN switching systems allow administrators to segment traffic, isolating videoconferencing data so that it has less impact on the network.

    In addition, the international videoconferencing standard H.323, which recently became available, is designed for use over packet networks such as LANs and the Internet. With H.323, packet-based backbones, such as the Internet or a private corporate intranet, can accommodate videoconferencing, eliminating the need for a specialized ISDN network.

    H.323’s architecture also allows network managers to control the impact of videoconferencing on corporate LANs. These gatekeeper functions include limiting the number of users and the amount of bandwidth an H.323 conference can consume.
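
    The bookkeeping behind those gatekeeper functions is simple to illustrate. Below is a minimal, hypothetical sketch (names and limits invented, not drawn from any shipping H.323 gatekeeper) of admission control that caps both the number of concurrent conferences and the total bandwidth they consume:

        class Gatekeeper:
            def __init__(self, max_calls, max_kbps):
                self.max_calls, self.max_kbps = max_calls, max_kbps
                self.calls, self.kbps_in_use = 0, 0

            def admit(self, requested_kbps):
                # Refuse the call if either limit would be exceeded.
                if self.calls >= self.max_calls:
                    return False
                if self.kbps_in_use + requested_kbps > self.max_kbps:
                    return False
                self.calls += 1
                self.kbps_in_use += requested_kbps
                return True

        gk = Gatekeeper(max_calls=10, max_kbps=1500)
        print(gk.admit(384))    # True: a 384K-bps conference fits
        print(gk.admit(1200))   # False: it would exceed the bandwidth cap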

    “We’re at the beginning of a fundamental change in the business of how working together happens,” said Bob Castle, president of VideoServer Inc., a Lexington, Mass., provider of multipoint conferencing systems.

    H.323 rolls out

    H.323-based conferencing systems already have begun rolling out to the consumer market, and Microsoft Corp. plans to bundle the standard into future releases of Windows 95 and Windows NT. Microsoft also is distributing a free H.323-compatible collaboration tool called NetMeeting. Most of the current H.323 tools are slated for use on the Internet, where H.323 will be useful because it’s designed to accommodate bandwidth as low as 28.8K bps.

    H.323 will be implemented in a variety of ways, with differing levels of performance. Software-based systems, in which audio and video compression and decompression are performed by the PC microprocessor, will have relatively slow frame rates and small video screen sizes. For example, Microsoft’s NetMeeting displays up to 10 fps (frames per second) and a small video screen size known as QCIF (Quarter Common Intermediate Format), which has half the width and height–and thus one-quarter the pixels–of full CIF. QCIF is 176 by 144 pixels; the higher-quality full CIF is 352 by 288 pixels.
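
    Some back-of-the-envelope arithmetic shows why those screen sizes track so closely with processing power. The sketch below assumes 4:2:0 sampling at 12 bits per raw pixel–an assumption, but a common one for H.26x video–and compares the uncompressed bit rates against the link speeds mentioned in this article:

        def raw_kbps(width, height, fps, bits_per_pixel=12):
            # Uncompressed video bit rate in kilobits per second.
            return width * height * bits_per_pixel * fps / 1000.0

        qcif = raw_kbps(176, 144, 10)   # NetMeeting-class picture
        cif = raw_kbps(352, 288, 15)    # hardware-assisted picture

        print(f"QCIF at 10 fps: {qcif:,.0f} Kbps raw "
              f"(~{qcif / 28.8:,.0f}:1 compression to fit 28.8K bps)")
        print(f"CIF at 15 fps: {cif:,.0f} Kbps raw "
              f"(~{cif / 384:,.0f}:1 compression to fit 384K bps)")

    Squeezing roughly 3M bps of raw QCIF video through a 28.8K-bps modem, or 18M bps of full CIF through a 384K-bps channel, is exactly the workload that dedicated compression hardware is meant to absorb.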

    Specialized videoconferencing add-on hardware is required for larger screen sizes. “Hardware accelerators will help make desktop video with H.323 more attractive,” said Walt Jones, vice president of R&D at VideoServer. “NetMeeting is a good product, but the size of the video it displays is discouraging. A third party could add an adapter card that would use the great NetMeeting interface but increase the frame rate.”

    Such specialized hardware likely will be found in corporate-oriented H.323 systems that are slated to roll out during the next year. For example, PictureTel Corp. plans to introduce an H.323-compatible version of its LiveLAN videoconferencing system next month.

    PictureTel officials claim the H.323 system will rival the company’s ISDN-based systems for picture quality, delivering a full CIF screen at 15 fps. According to Peter Mahoney, director of marketing at PictureTel, the company’s system will allow users to adjust the quality of the conference by choosing their transmission speeds (ranging from 64K bps to 384K bps). Generally, the higher the bandwidth, the better the quality of video.

    PictureTel and other videoconferencing vendors expect corporations and other H.323 users to begin pilot testing the technology this year before rolling out larger implementations next year.

    Conferencing evolves

    Such evolutionary growth is necessary because not all of the tools needed for corporate use of H.323 are in place yet. Current H.323-compatible products, including NetMeeting and Intel Corp.’s ProShare, can handle only single, point-to-point conferences. For multipoint conferencing, products must connect to an MCU (multipoint control unit), which links several conference participants. MCU vendors, such as VideoServer and RADVision Ltd., now are developing H.323-compatible systems.

    Because of the hurdles still remaining (and their investment in existing equipment), many organizations likely will have both an ISDN-based system, which uses the H.320 videoconferencing standard, and new H.323 systems.

    MCU manufacturers are currently working on ways to link the two systems, which use different addressing schemes: H.320 systems use telephone numbers, and H.323 uses IP addresses. “We’ll try to adopt a standard authentication and naming service, but this is an area where services still have to be invented,” said VideoServer’s Jones, noting that H.323 also must be made to work with existing addressing services such as the Domain Name System and directories using the Lightweight Directory Access Protocol.
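
    A bridging MCU therefore needs, at minimum, a directory that resolves one user alias to either kind of address. The sketch below is purely illustrative of the mapping problem Jones describes; the aliases, phone number and IP address are invented:

        directory = {
            "alice": ("h323", "192.168.5.20"),     # LAN desktop endpoint
            "boardroom": ("h320", "+1 555 0147"),  # ISDN room system
        }

        def resolve(alias):
            scheme, address = directory[alias]
            if scheme == "h320":
                return f"route through ISDN gateway, dial {address}"
            return f"place IP call to {address}"

        print(resolve("alice"))       # IP call
        print(resolve("boardroom"))   # ISDN dial-out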

    In addition, gatekeeper functions will need to be enhanced to work with new Internet mechanisms for reserving bandwidth. These mechanisms, such as the Resource Reservation Protocol, are designed to reserve bandwidth for multimedia applications. The gatekeeper and reservation mechanisms will need to work together in order to avoid access problems.

    [Figure: Videoconferencing systems with dedicated hardware can send larger images by tapping the Full Common Intermediate Format (left). Software-based H.323 systems transmit small images using the Quarter Common Intermediate Format (right).]

    Sometimes, Avoiding The Popular Product Works

    November 29th, 2014

    When Paul Morgan began evaluating push technology products last January, he had one top priority: anything but PointCast. Like many a corporate Webmaster, Morgan had experienced firsthand the pitfalls of the wildly popular screen-saver-cum-automatic-newsfeed, including high bandwidth consumption, interference with user productivity applications and the distraction of ads. There had to be a better way.

    As Web site engineer for GTE Corp.’s Information Services Division, Morgan was looking for a way to push out project-management information to GTE’s staff in a real-time, noninvasive way. Project-management data is crucial for systems integrator GTE, which services government and private industry. Morgan wanted critical information–such as word that a financial threshold had been reached (a project about to go over budget, for example) or that project deliverables had changed (a project about to miss a deadline)–in decision makers’ hands as soon as possible. Morgan also wanted the ability to have automatic desktop delivery of competitive information from external newsfeeds to marketing and analytic personnel.

    “We wanted something a little more emphatic than E-mail,” said Morgan. “Call it E-mail on steroids.”

    It made sense to distribute the information using one of the intriguing new push products on the division’s brand-new intranet, which was being rolled out to 700 users at the Chantilly, Va., site and to 350 off-site users. Although there were already a few hundred regular users of the PointCast Network at ISD, Morgan wanted to go elsewhere for push capabilities–with an eye toward eventually banning PointCast, or at least discouraging its use at the company.

    Morgan, along with his boss, Irv Zacks, vice president and general manager of ISD, was looking for something that would avoid the downfalls of PointCast and be much more closely tailored to distribute business information. “We didn’t want extraneous or distracting information,” Morgan said.

    PointCast’s use of advertisements was a particular bugaboo. “They distract users. And we didn’t want to dedicate corporate bandwidth to someone else’s message. It is simply desktop real estate that should be used for business applications,” Morgan added.

    “The ad-based model sets the stage for conflict between a company’s interests and PointCast’s interests. That is why a lot of the [push] companies targeting the intranet space are using different revenue models,” said Ross Scott Rubin, group director of Consumer Internet Technologies for Jupiter Communications Inc., in New York. “PointCast is under a lot of pressure to offer a different version with a modified client that would allow companies to filter the ads or do away with them altogether.”

    Examining the alternatives

    Zacks gave Morgan the directive that whatever product they chose must have links to external news sources to get competitive information–with no ads. Then he stepped back and let Morgan play the field.

    Morgan looked at BackWeb Technologies Inc.’s BackWeb, Intermind Corp.’s Intermind Communicator, Marimba Inc.’s Castanet and Wayfarer Communications Inc.’s Incisa. Morgan was intrigued by all these products, but he was most impressed with Incisa. An important point of differentiation was that the Wayfarer application appeared to be well-positioned to work with and augment both Netscape Communications Corp.’s Netcaster and Microsoft Corp.’s Active Desktop (both of which will contain push capabilities). Netcaster is due in the third quarter; delivery for Active Desktop has not yet been announced. Zacks wanted to be sure that GTE would not lose its “institutional investment” in implementing a push product only to have it supplanted by the Netscape and Microsoft offerings, he said.

    “Wayfarer was adroit at playing both camps, and that was a major difference between the products,” Morgan said. And at $25,000 for a 1,000-seat license, Incisa did not break the bank. The decision was made: Wayfarer Incisa was the one.

    During the evaluation period, Morgan also grudgingly evaluated–and even purchased–PointCast Inc.’s $995 I-server, which minimizes the amount of PointCast traffic entering the corporate firewall and also lets network administrators filter the types of content accessed by users. The I-server also can be used to broadcast business data over the intranet, but adding to a general anti-PointCast animus at the company, Morgan did not like the fact that there would only be one GTE ISD channel with all other channels controlled by PointCast. “Through gritted teeth, we bought the I-server to control the most outrageous [PointCast] use,” he said.

    Not PointCast, InfoCast

    Since the rollout began in March, 225 employees have been given access to Incisa, which GTE ISD has christened “InfoCast.” Users have the option of receiving information as a screensaver, a “headlink” (a headline with links to intranet stories) or a banner on the bottom of the screen. Morgan said his group recommends that users not elect the screensaver option because this will tie up the system for a few seconds before going back to the user application. This is one of Incisa’s few weaknesses, said Morgan, and it affects even systems running at 166MHz to 200MHz.

    GTE ISD users report that it is not annoying or obtrusive to receive headlinks because the information occupies a small space on the screen, only about 2.5 by 3.5 inches. Incisa includes a Shockwave animation sequence to get users’ attention. But, so far, this has remained a relatively staid, corporate type of animation, rather than an eye-popping cartoon. “We’ve been fairly conservative so far with the animations” in keeping with the goal of not unduly distracting users, Morgan said.

    Incisa pushes data from one of four sources: GTE ISD project-management databases; external news sources (such as Reuters, PR News Wire, Cnet and InfoSeek); “reporters” (individuals such as human resources managers who are empowered to push out news flashes); and “Web alerts” (a new capability through which users will receive only new material from selected Internet pages).

    Morgan and his users are anxiously awaiting the addition of several extra news sources, which are due soon. Wayfarer has been somewhat slow to add an InfoSeek search capability through which users can input a search and automatically receive updated results every time the results of the query change. “We are anxious to see how they’re going to incorporate search capabilities,” Morgan said.

    GTE network managers, for their part, are very happy with the minimal effect Incisa has on the network. “It is very gentle on network bandwidth. The notifications are quite small [a few kilobytes], even with the Shockwave animations,” said Morgan.

    Now at issue: the decision of whether or not to formally forbid PointCast. Morgan and Zacks are evaluating this issue and will come to a decision by the end of the month. “We’ll either come to a complete ban, or it will be culturally frowned upon,” Morgan said.

    Morgan is hoping the additional news feeds Wayfarer will be adding will be enough to satisfy users’ “CNN need,” he said. He isn’t too optimistic about eradicating users’ human thirst for nonbusiness information, however. Said Morgan, “There may be a residual batch of people who need to know about those bus turnovers in Peru.”

    IBM’s Support Tools Made The Brand

    November 14th, 2014

    IBM is readying a new suite of knowledge-based support tools that diagnose system problems, recommend solutions and provide troubleshooting via the Web for its corporate and retail customers.

    The Web-based tools, which will be announced within the next four to six weeks, are part of a broad plan to enhance IBM’s worldwide customer support and service infrastructure. The ultimate goal is to enable customers to more easily access support resources and resolve problems with IBM hardware, officials said last week during a briefing at the company’s HelpCenter technical support operations.

    The tools, which will be bundled on the company’s notebooks, desktops and servers and offered via its Web site, use logic to identify a customer’s system problem and recommend appropriate solutions. If the fixes involve software, such as BIOS updates, drivers or patches, they can be automatically downloaded to the customer’s system.

    IBM’s moves are part of a trend among hardware and software vendors to reduce the time customers spend calling technical support lines. It’s a theme that resonates with many IT professionals.

    “This could probably eliminate 80 percent of the support calls,” said Fred Erickson, director of technology and automation for Phoenix-based Avnet Computer Marketing Group, a division of Avnet Inc. “Certainly, in our notebook deployment, [Web-based support tools] would be a great vehicle for our users when they can’t get into our help desk.”

    IBM’s new support tools will be tightly integrated with the HelpCenter’s Lotus Notes-based customer service database, called LENA (Leading Edge Network Application). The tools will tap into the database’s service history for each customer, along with a list of known problems associated with the customer’s system.

    For instance, when a customer logs in to the HelpCenter’s Web site, his or her system and its specifications are automatically identified by serial number. The customer then types in the suspected problem using a customized search engine, and the database generates a list of recommended solutions.

    “It takes it to the next stage where the system does the work for you,” said David Williams, vice president of marketing and support for IBM Personal Computer Co., in Somers, N.Y. “The tools help customers help themselves.”

    As part of its Web-based tools development, IBM will offer an enhanced version of its Update Connector software designed for corporate notebooks and desktop PCs. Update Connector, which currently is available only on certain retail desktop PCs and IBM’s new ThinkPad 385D notebooks, lets users connect to the HelpCenter Web site, query the center’s database for a list of software updates for their specific machines and then download the necessary drivers and files.
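
    The update check itself is a simple comparison. Here is a hedged sketch of an Update Connector-style exchange; the catalog contents, serial number and field names are invented for illustration and are not IBM’s actual protocol:

        catalog = {   # serial number -> updates known for that machine
            "78-ABC123": [
                {"name": "bios", "version": 12},
                {"name": "video_driver", "version": 4},
            ],
        }

        def updates_needed(serial_number, installed):
            # Compare the HelpCenter's list against what the PC already
            # has and return only the items that need downloading.
            available = catalog.get(serial_number, [])
            return [u for u in available
                    if installed.get(u["name"], 0) < u["version"]]

        print(updates_needed("78-ABC123", {"bios": 12, "video_driver": 3}))
        # -> only the video driver (catalog version 4 > installed 3)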

    IBM’s HelpCenter offers 7-by-24 support and has a 2,500-person staff that handles 600,000 calls per month at eight call centers worldwide. During the past two years, IBM has spent more than $250 million bolstering its worldwide support and service infrastructure as it seeks to lead the charge into new venues for support, such as the Web.

    Other PC makers offer varying levels of Web-based support. Dell Computer Corp., for example, offers customers more than 35,000 pages of troubleshooting information via its Web site, where users also can download software upgrades, BIOS updates and drivers free of charge. In addition, the Round Rock, Texas, company offers a customer “premier” page in which customers can access Dell service and support team members’ phone numbers, E-mail addresses and pager numbers.

    The Internet-based technical support services from Compaq Computer Corp., of Houston, enable users to post questions to support personnel, download software from an FTP site and access product and service information.

    IBM’s new Web-based support tools

    * New electronic support tools make it easier for users to access IBM’s HelpCenter Web site.

    * Once at the site, the tools enable users to query the IBM LENA (Leading Edge Network Application) database for problems.

    * The tools diagnose problems and recommend solutions.

    * A corporate version of IBM’s Update Connector software will be preloaded in IBM commercial notebooks and desktops. The software lets users automatically download updated BIOS and drivers from the HelpCenter.

    Encryption Law Is A Tough Area

    November 5th, 2014

    Peter Browne is a casualty in the encryption wars.

    For the past year, Browne, senior vice president of information security for First Union Corp., has been unable to move aggressively on plans to implement an ambitious brokerage application on the bank’s Internet site. Why? Because, despite a red-hot battle between the White House on one side and encryption vendors and many Congress members on another, current law prohibits U.S. companies from “exporting” any product containing strong encryption. This means it would be illegal for $103 billion First Union, the nation’s sixth largest bank, to use strong (greater than 40-bit) encryption to encode financial transactions originating with customers outside the United States.
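
    To see what’s at stake in the 40-bit ceiling, consider the brute-force arithmetic: every additional key bit doubles the search space, so the 56-bit encryption discussed later in this article is 65,536 times harder to exhaust than an exportable 40-bit key. A few lines of Python make the point:

        # Key length vs. brute-force search space (illustrative only).
        for bits in (40, 56, 128):
            print(f"{bits}-bit key: {2 ** bits:.3e} possible keys")

        print(f"56-bit vs. 40-bit: {2 ** 56 // 2 ** 40:,}x larger keyspace")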

    “Strong encryption technology is absolutely critical for all the product and system plans we have,” says Browne, in Charlotte, N.C. “These restrictions have hampered us from expanding our higher-risk [Internet-based] transactions like brokerage.” Without the ability to use strong cryptography to encrypt messages going overseas, First Union would have to pass up the opportunity to expand its electronic commerce applications.

    Every company hanging out a shingle on the Internet is courting customers outside the United States–even if the company doesn’t officially bill itself as “global.” This means millions of companies–both within and outside the financial industry–are having to delay or even scrap plans to expand electronic commerce initiatives because of the restrictions on deploying products containing strong encryption to their foreign subsidiaries. (Companies can apply for a license to deploy or “export” products containing encryption technology to their offices abroad, but the process may be time-consuming and is easier in some industries, such as finance, than others.)

    Although battle-scarred, as of a few weeks ago Browne is no longer a victim in the fight over encryption-policy reform. On May 8, William Reinsch, Undersecretary of the Bureau of Export Administration for the Department of Commerce, issued a press release saying the DOC would allow the export of the strongest available data encryption products to support global electronic commerce–specifically for financial transactions.

    That’s certainly good news for First Union, because the newly relaxed policy applies directly to the bank’s plans for an Internet brokerage application. But while Browne moves ahead, he remains skeptical of the Reinsch initiative, because it’s not a full-blown regulation. Says Browne, “The restrictions are allegedly lifted for financial transactions. But the regulations haven’t even been drafted yet,” so companies won’t know the details on the new policy for some time.

    For another thing, the new regulations will apply only to commercial transactions. “The new relaxation on the rules of export is only for electronic commerce. For us, it’s very helpful. For some other multinational corporations, they’re still hamstrung,” says Browne.

    Browne is far from the only one to feel the squeeze of the encryption laws. Merrill Lynch has shelved the international version of its Merrill Lynch Online financial service, which has been a tremendous success with more than 150,000 subscribers in the first six months of the offering. “We decided this would be a domestic-only offering until the laws change,” says Randal Langdon, director of interactive sales technologies for Merrill Lynch, in Princeton, N.J.

    That is, unless or until the SAFE (Security and Freedom through Encryption) Act and the Pro-CODE (the Promotion of Commerce Online in the Digital Era) Bill–now making their way through the legislative process–become law. These bills would substantially relax the current encryption restrictions and give companies more freedom of choice in choosing the type of encryption they prefer.

    SAFE and Pro-CODE are at the heart of another aspect of the encryption fight: the Clinton administration’s mandate that companies provide the government with keys to “unlock” any data that is encoded with strong encryption so it could be quickly deciphered if a crime were committed. Encryption vendors and privacy advocates remain vehemently opposed to any scheme that makes the government a third-party trustee for the keys to encrypted data.

    News of the coming relaxation on encryption exports comes amid further advances in the fight to reform current law. Introduced in March, the Clinton administration’s Electronic Data Security Act of 1997–which proposed government access to encrypted data through a key-escrow program–fizzled out in April after failing to get any support.

    Then, on May 14, the House Judiciary Committee unanimously passed the SAFE Act, which would eliminate the current export restrictions on products containing strong (i.e., greater than 40-bit) encryption. The act now moves to the House International Relations Committee for debate before continuing through the legislative process.

    Even CIOs not drawn to the vagaries of legislative reform will be interested in the upcoming changes that will make it easier to protect E-mail messages and deploy Web browsers containing strong encryption–both priority security concerns for most in corporate IT.

    “The No. 1 concern is E-mail. Nothing else is even close,” says Jim Bidzos, president of security vendor RSA Data Security Inc., in Redwood City, Calif. “Their employees are communicating with foreign branch offices, suppliers, etc. The vulnerability is there.”

    The new S/MIME (Secure Multipurpose Internet Mail Extension) standard will extend encryption and authentication capabilities, giving the equivalent of signatures and envelopes to E-mail. Netscape Communications Corp. and Microsoft Corp. have announced plans to support S/MIME in the next versions of their browsers, due in the fall. Companies also will be able to deploy overseas the next versions of Netscape and Microsoft browsers containing 56-bit encryption, thanks to special approval from the DOC–a sign that things are definitely loosening up on the encryption front.

    Yet observers still question why White House attempts to address the issue of strong encryption still fall short. The White House position on encryption is suspiciously similar to the administration’s doomed 1993 Clipper Chip proposal. The Clipper Chip would have guaranteed government “trap door” access to encrypted data. “We call this Clipper 4.2.1,” says Kelly Hubner Blough, director of government relations for Pretty Good Privacy Inc., an encryption vendor in San Mateo, Calif.

    Privacy advocates are likewise baffled by the obstinacy of the government position against strong encryption. “The administration seems to go through bouts of policy masochism where they resurrect a policy like Clipper that everyone said was a brain-dead idea and dress it up and say, ‘Isn’t this fine?'” says Marc Rotenberg, director of the Electronic Privacy Information Center, a Washington public interest research group.

    Protecting national security and law enforcement are the two major concerns cited by the Clinton administration in justifying the need for tight controls on encryption exports. “We have a gigantic machine, called the NSA [National Security Agency], that collects information all around the world. They want to keep that machine running for as long as they can,” says RSA’s Bidzos.

    Although things are unsettled on the legislative front, one thing is clear: Something will happen, soon, to change the status quo of U.S. encryption law.

    Keep On Pushin’

    October 16th, 2014

    Gregg Petch’s first implementation of push technology was a real pushover. As CIO of the Metropolitan Regional Information System, one of the nation’s largest multiple listing services for the real estate industry, he needed a simple application to send IS updates to the help desk. Petch figured he’d need about two weeks to install and work out any kinks in Lanacom Inc.’s Headliner product. Surprisingly, the rollout took just two days.

    Now, the head of IT is tackling a more formidable challenge–tying together BackWeb Technologies Inc.’s push technology with an Oracle Corp. Web server and a custom browser to deliver real estate information to MRIS’ 30,000 member agents. Among Petch’s many tasks: using Visual Basic to write custom SQL statements to an Oracle database from BackWeb’s search engine. You’ll forgive Petch if he begs off providing an exact date for the completion of this summer project.

    Petch’s efforts will no doubt be watched closely by his fellow ITers. As push moves from beta sites into the real world, CIOs are looking for concrete results to determine if they should join the game. “It’s wait and see what the other guy does,” says Burt Weatherford, a network manager for Applied Materials Inc., of Santa Clara, Calif., which is in the early stages of using McAfee Associates Inc.’s SecureCast to push anti-virus updates to user desktops. But those on the sidelines won’t have to wait long. Several companies, from Ascend Communications Inc. to Toronto Dominion Bank, have recently completed or are putting the finishing touches on push implementations.

    Some 12 percent of companies with intranets have already deployed push, according to a new survey by Zona Research and IntelliQuest.

    However, as with any new technology, the people paying for push have yet to venture beyond rudimentary applications. Most are focusing on improving communications with employees, customers and business partners. Another group aims to boost productivity–from low-paid help desk workers to highly compensated sales reps and financial advisers.

    These early pioneers say push technologies have generally met their expectations. Many boast that they spent less money and time on implementations than anticipated. But at least one anonymous beta tester says he needs to see more robust technologies before he’ll open his wallet. And those doing mission-critical applications aren’t even considering push technologies from the gaggle of widely hyped startups. They’re sticking with proven vendors such as The Information Bus Co. Inc. (TIBCO), a 12-year-old Palo Alto, Calif., company that developed its own version of the technology long before “push” became part of the popular lexicon.

    At your service

    Improving customer service was the primary catalyst for seeking out push at several companies, including MRIS and Epson America Inc. Epson, a Torrance, Calif., printer maker, wanted a simple application to notify customers of changes on its Web site, such as new product announcements, awards, driver updates and sales promotions.

    Epson replaced an outdated listserv application with Intermind Corp.’s Global Publisher tool and Intermind Communicator client software. The makeover was relatively painless. Installing Intermind took about an hour: Webmaster Alex Nathanson loaded the 4MB Global Publisher on a 75MHz Pentium PC running Windows 95, without installing any software on his Web server. To get updates from Epson, users simply download free Intermind client software. (They fill out a profile saying what sort of info they want, and Intermind shoots back brief notes with hot links to the Epson Web site.) Nathanson declined to reveal the cost for his particular installation, but Intermind, of Seattle, charges $1,500 for a Global Publisher license for fewer than 500 subscribers per channel and $10,000 for more than 500 subscribers.

    Nathanson couldn’t be happier with the results. Posting updates now takes about 5 minutes, as opposed to several hours or days with the listserv. About 2,000 users have signed up for the service, which went live in March. The hit rate on Epson’s home page (www.epson.com) is about the same, but the number of visits is increasing, suggesting that people are using the site more efficiently, Nathanson says.

    MRIS needed both a simple and robust push solution, so it bought two–each addressing the support needs of different constituencies. For its 55-person help desk, the listing service purchased Lanacom’s Headliner, to help make the desk, which handles about 38,000 calls a month, more efficient and responsive. If one of MRIS’ Internet POPs (points of presence) goes down, the desk is flooded with calls from agents who have been booted off the online service. Prior to Headliner, a help desk rep couldn’t give a disgruntled caller any help, because he or she rarely knew what the problem was. Now, the reps see a streaming ticker on their PCs that alerts them to a technical problem with a POP and tells where to point the affected user.

    This summer, MRIS plans to push out the more complex project. Using BackWeb’s eponymous push technology, member real estate agents will be able to log on to MRIS and create profiles for each of their customers. The system will then query a database of prospective homes and push those that fit buyer profiles to the real estate agent. It gets even cooler: Since real estate agents tapping into MRIS are using standard Web browsers, BackWeb will push back HTML documents, photos, even videos, says Eric Beser, vice president of engineering for Targeted Multimedia Inc., an Owings Mills, Md., VAR working on the project.
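
    The matching step Beser describes boils down to filtering a listings database against each buyer’s saved criteria. Here is a deliberately simplified, hypothetical version of that profile-driven push (field names, prices and ZIP codes invented, not MRIS’ actual schema):

        listings = [
            {"id": 101, "price": 180000, "beds": 3, "zip": "21117"},
            {"id": 102, "price": 420000, "beds": 5, "zip": "21117"},
        ]

        profile = {"max_price": 250000, "min_beds": 3, "zips": {"21117"}}

        def matches(listing, profile):
            # A listing is pushed only if it satisfies every criterion.
            return (listing["price"] <= profile["max_price"]
                    and listing["beds"] >= profile["min_beds"]
                    and listing["zip"] in profile["zips"])

        push_queue = [l for l in listings if matches(l, profile)]
        print(push_queue)    # only listing 101 fits this buyer's profile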

    MRIS CEO Dale Ross figures the BackWeb deployment will cost “hundreds of thousands of dollars.” But it’s worth it, because it will make MRIS members, who pay $35 per month for the online service, more efficient. “Realtors don’t have the time to go and look for technology,” Ross says. “It’s up to us to look at the cutting edge.”

    If presenting a $200,000 proposal to your CEO makes your stomach sour, consider taking another tack: Pitch push as a cost-cutter. Consider Ascend’s implementation of Diffusion Inc.’s IntraExpress. Ascend expects the push product to allow it to eliminate most mass mailings to VARs, an expense that averages about $35,000 for a new product rollout. Les Sparrey, director of VAR channel marketing for the Alameda, Calif., company, figures IntraExpress will pay for itself in less than a year.

    Ironically, Sparrey was not drawn to push because of cost savings. He was more concerned about strong anecdotal evidence that Ascend’s 100 VARs–and their 1,300 sales and support reps–were not reading reams of expensive mailings about new products. So he went in search of a product that delivered the same information to VARs in any form they wanted, from E-mail to faxes to pager notifications. IntraExpress was the only product he found that could meet all those requirements.

    Apps on demand

    Only a small number of companies are actually pushing applications to desktops, although that’s one of the most touted features of push, particularly Marimba Inc.’s Castanet. One of the few companies willing to talk about its use of Castanet is Wired Digital Inc. (formerly Hotwired), the San Francisco online arm of Wired magazine. Ed Anuff, director of product management, says he was driven to purchase a $25,000 Castanet Transmitter because Wired Digital’s Talk.Com Java application turned out to be more of a resource hog than anticipated. So Wired Digital created a smaller Java version of the service and a more robust Java app that is downloaded through a Castanet channel and viewed with a free Castanet Tuner.

    The beauty of Castanet is that users download the app just once–instead of pulling the whole thing down every time they sign on. More importantly, Wired Digital can transparently improve the application, making bug fixes or other changes, without bothering users.

    “Users don’t have to worry about it, and that’s very important because of the challenges of Internet time,” Anuff says.

    Push startups may be getting their fair share of brand-name customers for basic applications, but few have landed a mission-critical implementation. Users in that arena just can’t afford to take any risk. That’s the case with TD Securities Inc., of Toronto, which is in the final stages of a project to push select data out to its traders. TD had outgrown its 5-year-old system, which lacked a customizable user interface and didn’t update pages as quickly as newer technologies. “When markets begin to move quickly, you could potentially miss a trade or mis-price a trade if there is a delay of 10 to 15 seconds,” says Steve Tennyson, director of systems and technology for TD, a subsidiary of Toronto Dominion Bank of Canada.

    When TD went shopping for solutions, it only looked at companies with a track record in financial services. “It has to be a proven technology because it has to scale for a large number of users,” says Tennyson. “And, because it is mission-critical, it has to have a strong support organization behind it.”

    It was no small factor that the product TD chose, TIBCO’s Market Sheet for Windows, was already being used by four of Canada’s five big banks. TIBCO, a division of Reuters Holdings plc, makes software that has been incorporated into push offerings from about a dozen developers, including BackWeb, Diffusion and IBM’s Lotus Development Corp. division.

    With TIBCO’s TibLink software, market data and stock quotes from Reuters and Telerate are piped into TD’s servers and distributed to 350 Pentium desktops, where each user sets up a custom view with Market Sheet for Windows. For example, a trader can create a graph for the U.S. dollar that dynamically changes as currency rate data streams in.
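
    Under the hood, this is a publish/subscribe pattern: a feed handler publishes each quote on a named subject, and every desktop that has subscribed to that subject gets a callback to refresh its view. The sketch below is a generic illustration of the pattern, not TIBCO’s actual API; subject names and values are invented:

        subscribers = {}    # subject name -> list of callback functions

        def subscribe(subject, callback):
            subscribers.setdefault(subject, []).append(callback)

        def publish(subject, value):
            # Fan the update out to every view subscribed to this subject.
            for callback in subscribers.get(subject, []):
                callback(value)

        subscribe("FX.USDCAD", lambda v: print(f"trader A chart tick: {v}"))
        subscribe("FX.USDCAD", lambda v: print(f"trader B grid tick: {v}"))
        publish("FX.USDCAD", 1.3842)    # both desktop views update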

    Ultimately, the system will save TD money because its improved user interface and faster updates will increase productivity, Tennyson says.

    The skeptics

    Despite all the hype, not everyone is wild about push. “It’s just not on the lips of my clients, who are among the top 20 [financial, insurance and energy] companies in the Fortune 500,” says Chris Dallas-Feeney, a partner in the strategic technologies group for Booz-Allen & Hamilton Inc., in New York. “They’re talking about doing secure financial transactions, public key encryption, getting customers access to relevant information through extranets.”

    Others say the technology needs to mature. Mitch Hadley, vice president of strategic technology for NationsBank, in Charlotte, N.C., ran a pilot for Wayfarer Communications Inc.’s Incisa product for about four months but closed the project last month. Incisa and the other push products on the market don’t have a fine-enough filter to make them truly useful, Hadley contends.

    You’ll hear the same from Hewlett-Packard Co. CIO Robert Walker, who commands a 100,000-desktop intranet. His IT organization is looking at various push technologies but has no plans for a wide-scale deployment. “The challenge you have is trying to push information to knowledge workers,” Walker says. “How do you figure out exactly what they need at one point in time?” The only solution he sees at the moment is to create a staff of “editors” who can act as a secondary filter to refine the content pushed to knowledge workers.

    With so many push implementations just getting off the ground, it will likely take time to get substantial feedback from users. Only then will we know if push is really a “pushover” or yet another technology that promised more than it could deliver.

    SAP Is Too Big? A Debate

    September 17th, 2014

    SAP designed R/3 to serve as a standard business platform while remaining adaptable to customers’ changing business needs. Two good examples of this are the hundreds of complementary software solutions created by SAP’s partners, and the creation of R/3 3.1–SAP’s Internet-enabled version of R/3–in less than nine months.

    Taschek’s statement about SAP sales cycles assumes that selecting R/3 is as easy as selecting a spreadsheet at a local computer store. In fact, SAP sales cycles often require a significant length of time. Because R/3 is a business-critical application, all of our customers rely on its stability and data integrity.

    The process of selecting and implementing R/3 varies widely. Dozens of companies have brought R/3 “live” in as little as four months using SAP’s rapid-implementation methodology. In January 1997, 60 percent of the 750 companies that went live with R/3 did it in 10 months or less.

    Taschek’s inference that R/3 is proprietary is flatly incorrect: R/3 has always incorporated key industry standards.

    Lastly, rather than being technical interface specifications, SAP’s Business APIs serve as conduits for exchanging business logic and information between R/3 and complementary applications. In reality, SAP’s BAPIs offer far greater long-term stability and simplicity than competitive solutions.

    Paul Wahl, president SAP America Inc., Wayne, Pa.

    I am answering John Taschek’s call for satisfied SAP users. I could not disagree more with Taschek’s assessment of SAP. We experienced none of the nightmarish aspects of implementing SAP that were described in the article.

    We began the sales process with SAP in January 1995, closed the deal the first week of March, began implementation in mid-April, and went live with three substantial functional modules of SAP in seven manufacturing facilities four and a half months later–on Sept. 7, 1995. The consulting costs were less than $300,000.

    Pacific Coast Feather Co. has realized significant payback from our implementation of SAP. We have reduced order turnaround time, improved customer service levels, significantly improved our inventory control and improved efficiency in accounting by reducing tasks that once took weeks to tasks that now require minutes. We are now, thanks to SAP, able to provide real-time information to management in a flexible, meaningful way that contributes to strategic decision making.

    I have worked with business application software for 20 years and I think SAP is the best thing to happen to business systems since the computer. SAP has created business applications which, for the first time ever, are able to take advantage of state-of-the-art technology. They have eliminated the gap between technological possibilities and business application realities.

    I have no concerns about their ability to stay ahead of the pack with regard to conducting business over the Internet.

    Gwen Babcock, MIS director Pacific Coast Feather Co., Seattle

    Microsoft? scared?

    I see the options a little differently than Rob O’Regan (“A Grumpy Microsoft Goes in Search of a Caffeine Fix,” April 28).

    Option 1: “Microsoft can continue to parry with JavaSoft over control of the Java platform. But that’s a battle Microsoft is destined to lose …”

    I expect more parrying, and I doubt very much that Microsoft will lose. Microsoft has an excellent Java product in Visual J++. Symantec has an excellent product in Visual Café. Although Sun’s Java WorkShop is a great product by comparison with other Unix tools, it is still way behind these other products.

    No matter how fast Java goes to a standards committee, the market will be ahead of the standard. Delaying the standard buys time for Sun to change the core specs, but it also opens the door for the market to move away from Sun.

    Option 2: “It can move whole-hog to Java, rewriting the core Windows technologies to support that platform.”

    Java is not nearly as powerful as C++, nor is it anywhere near as fast. Java offers nothing for the Windows OS. It can be used for application development and as it matures it could challenge C++. More likely, it will be another tool in a good developer’s toolbox.

    Option 3: “Microsoft will try to convince customers that the Internet is one part of the enterprise, not vice-versa, and that to be successful in the enterprise, companies need a full-blown operating platform that includes Windows 95 on the desktop and NT on the back end. This is a risky proposition.”

    This is not a risky proposition. The real story here is not that Java will replace Windows as an OS. (Java isn’t an OS, so it won’t.)

    As for Bill and Co. being up at night, they’re not worrying that Java will displace Windows. Rather, someone had a good idea before they did and they are trying to make as much as they can on what’s left.

    Airlines Took Too Long To Give Power To Passengers

    August 18th, 2014

    With vendors of notebook PCs unable to deliver systems with enough battery life for long flights, airlines are getting set to launch a system that provides power to passengers in their seats.

    The new power distribution system, which will be rolled out in airline fleets later this year, will tap power from generators in a plane’s engines and distribute 15 volts of DC power to outlets located in seat armrests. This should be a boon to travelers with notebook PCs who would like to make flying time more productive.

    However, the initial deployment will only be in first-class and business-class seats on aircraft making international and long-range flights, and those who want to use the outlets will need to buy a special power adapter for their notebooks.

    For the airlines, safety is the primary consideration, so the system is designed to provide convenience to passengers without robbing other plane systems of power. Primex Aerospace Co., of Redmond, Wash., has designed one of the first systems, the EmPower, which consists of three major components: an MCU (master control unit), an ISPS (in-seat power supply) and outlets (see diagram).

    The MCU manages the power that comes from generators in the plane’s engines, distributing it to as many as 120 seats per MCU. The device monitors the system for faults and system load, as well as controlling the ISPS. The flight crew can turn off power to the MCU when necessary, including during takeoff and landing.

    Ideally, each seat on a plane would have outlets that could be used as needed. However, because there is a limit on the amount of current available to the system, a load-limit select module is used to set that limit, and then the MCU monitors how much power is being used.

    In normal operation mode, all outlets are enabled but not all are in use. As more passengers plug in notebooks, the system reaches the load limit. At this point, the MCU turns off the ISPS units not already in use. The master control unit continues to monitor the load until it is 10 percent under the maximum, at which point the remaining in-seat power supply units become available for use.
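
    The load-management policy just described is easy to state as code. The sketch below follows the article’s description (shed idle in-seat supplies at the load limit; re-enable them when load falls 10 percent under the maximum), but the class, units and numbers are invented for illustration:

        class MasterControlUnit:
            def __init__(self, load_limit):
                self.load_limit = load_limit
                self.active = {}             # seat -> power actually drawn
                self.standby_enabled = True  # may idle outlets power up?

            def plug_in(self, seat, load):
                if not self.standby_enabled:
                    return False             # outlet stays dark
                self.active[seat] = load
                self._recheck()
                return True

            def unplug(self, seat):
                self.active.pop(seat, None)
                self._recheck()

            def _recheck(self):
                total = sum(self.active.values())
                if total >= self.load_limit:
                    self.standby_enabled = False   # shed unused supplies
                elif total <= 0.9 * self.load_limit:
                    self.standby_enabled = True    # 10 percent under max

        mcu = MasterControlUnit(load_limit=600)
        for seat in range(10):
            mcu.plug_in(seat, 75)    # the limit is reached partway through
        print(mcu.standby_enabled, sum(mcu.active.values()))   # False 600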

    Although the engine generators supply AC power by default and notebook power supplies tap AC power, the airlines specified DC power at the seats for safety reasons.

    Each ISPS unit converts the power supplied by the MCU from AC to DC and provides another level of fault tolerance to the system. These units monitor for faults and over/under voltage situations.

    To conserve power, the in-seat power supply operates in standby mode until a notebook is plugged into an outlet. Once this happens, the ISPS unit begins supplying power, unless the notebook causes a fault in the system, in which case it cuts off power to the outlet.

    Let there be light

    Passengers can tell when in-seat power is available by looking at an LED next to the two outlets. The LED illuminates at half brightness to indicate that power is available and is fully lit when power is being supplied to a notebook.

    Because of safety considerations, no power is supplied to either outlet at a passenger’s seat unless a cable is plugged in. The power adapters fit snugly enough to keep from being accidentally unplugged and to keep spilt liquid from getting into the outlet. The second outlet could be used either to charge batteries or to power a second electronic device, such as a handheld video player.

    A formal standard for the connector type that will work in these outlets has not been ratified by the airline industry’s standards board, but of the nine airlines that have announced plans to deploy such a system, only American Airlines has proposed using a unique jack. The first airline to announce plans for in-seat power, American suggested using an automobile-type cigarette lighter jack because of its prevalence.

    The proposed connector standard, ARINC-628, will likely be ratified later this year and uses a smaller connector than the one American Airlines has proposed. This connector would fit in the EmPower outlets.

    For notebook users interested in buying connectors for the EmPower outlets, power adapter supplier Xtend Micro Products Inc., of Irvine, Calif., has announced adapters that will work with the EmPower system.

    Government Gains Efficiency With Strong Intranet Design

    July 19th, 2014

    With the PC explosion in the late ’80s and early ’90s, the California Department of Education encountered a problem faced by many companies and organizations: Although PCs empowered end users, the spread of information resources to the desktop hindered the department from getting an enterprisewide view of its data.

    As a result, state legislators had trouble finding the financial, demographic and performance data about students that is necessary for making informed policy decisions. That situation didn’t change until the department adopted an Internet/intranet solution to easily access this complex mass of information.

    “Certainly, companies want to set a single standard [for handling data], and intranets are a practical method,” said Matt Nerney, an analyst at Aberdeen Group, a consultancy in Boston. “It’s not as difficult to get people to move in that direction, because they see the benefits of an intranet.”

    Information sprawl

    As you might expect for such a sprawling state, the California Department of Education oversees a vast amount of information. The department disburses more than $20 billion a year in funding. The state’s complex educational system comprises some 1,000 school districts, each legally autonomous with its own governing board. The situation is further complicated by numerous JPAs (Joint Powers Agreements), in which districts band together for specific programs and special education projects.

    “We’re not like the Florida school system, which is very hierarchical,” said Brian Uslan, manager of client system services for the department’s information systems and services.

    Although department funds were always allocated properly, Uslan admitted that “basic types of questions couldn’t be answered.” The department knew how much state money was going into individual programs, but the lack of a consolidated data source prevented it from determining total state funding to individual school districts. Indeed, in 1994, during one of the emotionally charged discussions of education that are common in the Golden State, an editorial in The Los Angeles Times griped that “expenditure data is not readily available” on educational spending.

    The department made its first stab at climbing out of the data quagmire in the early 1990s when it adopted a product called Metaphor from Metaphor Computer Systems. Metaphor combined a GUI with relational database technology, allowing nontechnical users to access multiple databases and construct their own applications. At a time when 16-bit operating systems were commonplace, Metaphor operated with its own 32-bit system.

    “Initially, Metaphor was a wonderful product for modeling data, providing quick interpretations, and it was very user-friendly,” Uslan said.

    Metaphor was purchased by IBM, which moved the product over to its own 32-bit operating system, OS/2. The machines were powerful, Uslan said, providing integrated spreadsheets, word processing and back-end connections to databases. But there was a problem: OS/2 was difficult to integrate into a network environment–especially on the desktop, where people were using Windows 3.1 at the time, Uslan noted. As a departmentwide platform, it just didn’t work in the new PC-focused world.

    “The PC has added a tremendous amount of power and autonomy, but the downside is understanding information resources from an enterprisewide mind set,” Uslan said. “We didn’t transition in a coordinated manner from the mainframe, and it’s taking a lot of work to bring the department back into a cohesive environment.”

    The problems continued for a couple of years until IBM developed a Web kit for Metaphor, which had evolved into a product called IDS, or the Intelligent Data Server. Standardizing on Internet technology was an obvious solution for getting a handle on the department’s data, especially because, as a government organization, the department must make most of its information available to the public.

    The decision was made to go with a dual Internet/intranet approach. “As we moved ahead with our internal project, another nonprofit, nonpartisan organization called EdSource was looking for ways to bring extensive data to the public in a context that had meaning,” Uslan explained. The department’s database, along with the Internet technology, provided a user-friendly interface to accomplish that.

    Many of the local school districts around the state now access information through the resulting public Web site, at 165.74.253.5/available.html. The site is integrated into a broader site that provides background and context to this educational data. The Ed-Data Partnership–a consortium of school districts, county offices, EdSource and the department–developed that broader site, at www.ed-data.k12.ca.us, which provides the point of entry to the IDS site.

    The department’s Web site is designed to allow users to make standardized queries about expenditures per pupil, enrollment information and student demographics. A popular feature permits users to compare similar school districts based on specific criteria. “Before this, a district supervisor could often only find out some of this stuff if [he or she] happened to run into a supervisor of another district at a conference,” Uslan said.

    The standardized queries represent more than 80 percent of the general funding questions the department of education receives from the field. “The system develops queries on the fly, so we don’t have to maintain huge files,” Uslan said. “We’re very excited and want to do this further by having more modeling flexibility within the environment.”
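
    “Queries on the fly” here means that the parameters a user picks on the Web form are turned into a parameterized SQL statement at request time, rather than being served from pre-built report files. Below is a hedged sketch of the idea; the table and column names are invented, not the department’s actual schema:

        def build_query(criteria):
            # Turn form selections into one parameterized SELECT statement.
            clauses, params = [], []
            for column, value in criteria.items():
                clauses.append(f"{column} = ?")   # placeholders, not string pasting
                params.append(value)
            sql = "SELECT district, enrollment, spending_per_pupil FROM districts"
            if clauses:
                sql += " WHERE " + " AND ".join(clauses)
            return sql, params

        print(build_query({"county": "Sacramento", "grade_span": "K-12"}))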

    State laws detail strict guidelines about what information can be revealed–for example, a 200-student school district with a couple of Hispanic students could not break out information by ethnicity. No confidential data is on the data server. By having both an Internet and intranet, outside people are prohibited from accessing data that hasn’t been audited or from seeing private data about individual students.

    If someone wants additional information not covered by the standardized queries, a request can be made for the department to run the query through the intranet. Currently, those queries are run by two employees; soon that searching ability will be expanded to all desktops within the department.

    For initial proposals, the IDS system is used without the Web server. This allows access to the same data sources but provides the confidentiality needed to evaluate potentially sensitive proposals–such as weighing the benefits of adding a cost-of-living allowance against putting more funds into a specific program. “These are proposals where there are winners and losers,” Uslan said.

    Those kinds of issues are touchy, and the intranet keeps the preliminary discussion from turning into a public firestorm. “That’s like having an emergency relief plan in place for a natural disaster,” Uslan said. “You don’t actually think a disaster is going to happen, but you need to have the plan in place. In education, there are a lot of groups with a variety of interests–such as home owners, teachers and students–and we need the freedom to evaluate proposals without that information coming into public view immediately.”