December 27, 2015 – 3:34 pm
“Survival of the Fitters.”
I don’t know whom to credit for that wonderful line. I first heard it from Fred Snow, a vice president with the distributor TechData, but he says he picked it up somewhere else.
In any case, I’ve been stealing the line with glee, for not only is it a nice play on words, it’s a perfect description of the PC marketplace these days. It captures the effects of last year’s downward pricing pressure on PC hardware vendors and of the market’s swing to Windows applications on software vendors.
That phrase came to me again last month. I’d gone off into the woods, so to speak, for a week, to get away from my daily routine and think about where management of the corporate computing function is headed.
“Survival of the Fitters” indeed. How better could we describe the upheavals in IS management over the past decade — and especially the last two or three years?
So many of the themes that have defined business computing during the ’80s and ’90s are really variations on that idea.
With the advent of PCs, …
December 8, 2015 – 8:28 am
Hard drive recovery costs vary from case to case. Prices cannot be the same for every type of hard drive and every kind of hard drive failure. A range of factors determines what the total data recovery cost will be for you. Some of these factors are listed here.
One of the most basic factors in the cost of hard drive recovery is the type of hard drive itself. Hard drives differ in interface; recovering data from a SCSI drive, for example, would cost more than recovering it from an IDE drive. Similarly, different costs are associated with recovering data from drives of different sizes and models.
Another factor that largely determines hard drive recovery cost is, of course, the type of damage done to the drive. A physically damaged drive usually requires more recovery effort than a logically damaged one.
The type of operating system also has an impact on the cost of data recovery. Data recovery pricing for the UNIX file system would …
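The factors above could be combined into a rough pricing model. The following sketch is purely hypothetical: the dollar figures, multipliers, and surcharges are invented for illustration and do not come from the article.

```python
# Hypothetical cost model -- every figure here is an invented example,
# not a quote from any recovery service.
BASE_RATES = {"IDE": 300.0, "SCSI": 500.0}        # interface affects cost
DAMAGE_MULTIPLIER = {"logical": 1.0, "physical": 2.5}
OS_SURCHARGE = {"DOS": 0.0, "UNIX": 150.0}

def estimate_recovery_cost(interface, size_gb, damage, os_name):
    """Combine the listed factors into a single rough estimate."""
    cost = BASE_RATES[interface]
    cost += 10.0 * size_gb             # larger drives take longer to image
    cost *= DAMAGE_MULTIPLIER[damage]  # physical damage needs lab work
    cost += OS_SURCHARGE[os_name]      # some file systems cost more to parse
    return round(cost, 2)

print(estimate_recovery_cost("SCSI", 4, "physical", "UNIX"))  # 1500.0
```

The shape of the model, not the numbers, is the point: each factor the article names feeds into the final quote independently.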
December 4, 2015 – 8:17 am
When DOS 2.0 was brand-new in 1983, I was using Lotus 1-2-3 to maintain a mailing list. It was easier than using EDLIN and the DOS Sort utility, but it still seemed like a clumsy, error-prone approach.
That’s when I decided to get acquainted with my first database package — Microrim’s R:base. My reaction was love at first prompt.
That early experience with R:base gave me some convictions about the proper way to configure a database. Those convictions persist to this day, although I was often forced to use other tools because of a client’s preference.
When I recently had the chance to get reacquainted with…
November 28, 2015 – 9:06 pm
The U.S. semiconductor industry has staged a startling comeback from its dark days of the mid-1980s.
It was in 1986 that American semiconductor companies, which had once enjoyed a 70 percent share of worldwide sales, watched their market share slip below 40 percent and their number of dynamic RAM manufacturers dwindle from 11 to two. Japan quickly capitalized on the erosion of the U.S. semiconductor industry and vaulted to the top.
“People seriously thought there wasn’t going to be a semiconductor industry in the United States by the end…
November 18, 2015 – 11:42 am
Vendors love to solve problems by declaring them solved. Declarations don’t require much research and development, don’t take much time to produce and don’t cost much to make.
Unfortunately, simply saying something is so doesn’t make it so.
Client/server database systems, for example, are not necessarily distributed database systems. No matter what their vendors might say — and some vendors are trying to equate the two — the two are different.
Spotting a client/server database system is pretty easy. A server machine runs database server software. Some machines networked to that server run client software that uses the data-management services of the server database software. …
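The split described above — one machine owning the data and running the server software, clients merely sending requests — can be sketched in a few lines of (modern) Python. The wire protocol and the sample schema here are invented for illustration; they stand in for whatever a real server product would use.

```python
# Minimal sketch of the client/server split: one process owns the data
# and runs the "database server software"; the client only ships a query
# and consumes results. Protocol and schema are invented examples.
import json
import socket
import sqlite3
import threading

def run_server(sock):
    """Serve one query per connection from an in-memory database."""
    db = sqlite3.connect(":memory:", check_same_thread=False)
    db.execute("CREATE TABLE parts (name TEXT, qty INTEGER)")
    db.executemany("INSERT INTO parts VALUES (?, ?)",
                   [("486DX2", 25), ("Alpha 21064", 3)])
    conn, _ = sock.accept()
    query = conn.recv(4096).decode()
    rows = db.execute(query).fetchall()  # data management stays server-side
    conn.sendall(json.dumps(rows).encode())
    conn.close()

server = socket.socket()
server.bind(("127.0.0.1", 0))  # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=run_server, args=(server,), daemon=True).start()

# The client knows nothing about storage; it just sends SQL to the server.
client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"SELECT name, qty FROM parts ORDER BY qty")
result = json.loads(client.recv(4096).decode())
client.close()
print(result)
```

Note what is absent: the data itself never lives on the client, and nothing here is distributed — there is exactly one server, which is precisely the distinction the column is drawing.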
November 6, 2015 – 7:24 am
Oracle Corp. customers will see the benefit of the company’s new Glue middleware immediately: Glue’s architecture is built to take advantage of the optimization features of the Oracle database server. But while Glue has a strong architecture, its success will depend on its acceptance and use by other software vendors.
PC Week Labs examined a beta version of Oracle’s middleware, a software layer that allows front-end software to talk to back-end databases. Glue takes middleware a step further by including E-mail and personal digital assistants (PDAs) as data sources that can be linked into the corporate data network.…
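The middleware idea the review describes — one front-end interface routed to heterogeneous back ends, including e-mail alongside databases — might be sketched as follows. The class and source names are invented for illustration and are not Oracle's Glue API.

```python
# Hypothetical middleware layer. Names are invented, not Oracle's API;
# the point is one front-end call fanned out to heterogeneous back ends.
class DataSource:
    def fetch(self, key):
        raise NotImplementedError

class DatabaseSource(DataSource):
    def __init__(self, rows):
        self.rows = rows
    def fetch(self, key):
        return self.rows.get(key)

class MailSource(DataSource):
    def __init__(self, inbox):
        self.inbox = inbox
    def fetch(self, key):
        return [msg for msg in self.inbox if key in msg]

class Middleware:
    """Front ends call one interface; the layer routes to a back end."""
    def __init__(self):
        self.sources = {}
    def register(self, name, source):
        self.sources[name] = source
    def query(self, name, key):
        return self.sources[name].fetch(key)

glue = Middleware()
glue.register("db", DatabaseSource({"acct-42": "Acme Corp."}))
glue.register("mail", MailSource(["re: acct-42 invoice", "lunch?"]))
print(glue.query("db", "acct-42"))    # Acme Corp.
print(glue.query("mail", "acct-42"))  # ['re: acct-42 invoice']
```

As the review notes, an architecture like this is only as useful as the set of back ends other vendors actually plug into it.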
October 30, 2015 – 11:57 am
Digital Equipment Corp.’s upcoming desktop systems, based on its new Alpha processors, will be the dream machines of the Windows NT world, according to an examination of a preproduction unit by Geekstir.
Expected to be released when Microsoft Corp.’s Windows NT is announced in the second quarter, the Alpha AXP 21064-based system examined by the Labs uses a minitower case and standard PC components and will cost between $7,000 and $10,000, depending on configuration. DEC also has plans for both lower- and higher-end Alpha systems.
Side-by-side comparisons with a 25/50MHz 486DX2-based system running our test release of Windows NT were no contest. The Alpha system, still far from its final, optimized state, was considerably faster than the 486 PC. We compared the Alpha and 486 systems by running simultaneous generations of fractal images on each, using the fractal demonstration program included with the Windows NT Software Development Kit (SDK).
The test was heavily floating-point-math intensive, stressing one of Alpha’s strong features.
Our test system ran at 125MHz, short of the 150MHz expected in release-level systems. The NT Alpha compiler — and therefore the applications we tested — …
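The character of that workload can be illustrated with a small Mandelbrot loop in (modern) Python. This is not the SDK's fractal demo, just a sketch of the kind of tight floating-point arithmetic such a test exercises — the arithmetic that favored Alpha's FPU.

```python
# Illustrative Mandelbrot kernel -- a stand-in for the SDK fractal demo,
# showing why such a test is floating-point intensive.
def mandelbrot_escape(c, max_iter=100):
    """Count iterations until |z| exceeds 2 -- pure float/complex math."""
    z = 0j
    for n in range(max_iter):
        z = z * z + c          # one complex multiply-add per iteration
        if abs(z) > 2.0:
            return n
    return max_iter            # point is (probably) inside the set

def render(width=40, height=20):
    """Compute escape counts over a small grid of the complex plane."""
    grid = []
    for y in range(height):
        row = []
        for x in range(width):
            c = complex(-2.0 + 3.0 * x / width, -1.2 + 2.4 * y / height)
            row.append(mandelbrot_escape(c))
        grid.append(row)
    return grid

image = render()
print(len(image), len(image[0]))  # 20 40
```

Every pixel costs up to `max_iter` complex multiply-adds, so the workload scales with resolution and iteration depth and spends nearly all its time in floating-point units.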
October 15, 2015 – 4:27 pm
Football is big business, and the Super Bowl is the biggest of the big. The National Football League, a PC Week Corporate Lab Partner, spent two weeks building a miniature village near the site of the Super Bowl, complete with stores, houses, offices, an amusement park and a satellite-dish forest. And this village is becoming increasingly computerized.
The day before the Super Bowl, NFL officials took us on a tour of the official statistics control center. During the game, this small room is a hotbed of activity as each play, participant and result is entered into a Xenix-based multiuser system. This system, written by a small consulting firm called Emphasys, runs on a network of Zenith MultisPort laptops. During the game, statistical summaries are selected by the NFL and synchronized with the in-house…