
July/August 1979 
Volume 5, Number 4 


Interactive Computing 

The Newsletter of the Association of Computer Users 





Editorial — Page 2

Benchmarks of Small Computer Systems — Page 4

Comparing Programming Languages — Page 8






















It’s About Time! 

Editorial 


With this issue we reach the midpoint of our 12-issue series of benchmark tests and sample the fruits of our labors. It is actually a summary of a summary, and our most complete comparison to date.


Six-Issue Summary Compares the Systems 

The mid-year summary issue of the benchmark
series, from which the extract on the following
pages was drawn, offers a concise summary of the
hard data collected during the individual benchmark
runs. Three types of tests are reported on in
the individual reports: CPU intensive benchmarks,
I/O intensive routines, and “real life” problems, but
only the results of the “real life” problems are
contained in this issue. Nonetheless, the overview
provided by this comparison is at times startling.
Differences in system architecture and operating
system software are sharply revealed in the CPU
and I/O intensive tests. Factors of 10 and 15
commonly separated the fastest and slowest
systems. (In other words, the fastest one was 10 to
15 times faster than the slowest one!) Yet all the
systems tested are within about 30% of each other
in fully-configured price.

Such statistics expose the fallacy that systems in
the same price range, and in comparable configurations,
necessarily provide similar performance.
While some offerings excel in number-crunching,
others have streamlined I/O operations or enhanced
language capabilities. The benchmark tests
bring out these differences, allowing us to assess
the merits of each machine for a particular type of
application.

Profile pages devoted to each system are a very
useful feature of the full summary report. These
pages list the best features and drawbacks, and the
conclusions reached by the testing agency, Real
Decisions Corporation. It’s important to remember
that the suitability for a particular application
depends on many other factors in addition to
speed. Software packages and vendor support play
a vital role in many applications. This is particularly
true when evaluating systems intended for small
business users as opposed to those slated for an
engineering development lab or scientific research
facility. The difference in user sophistication makes
the software support aspect a crucial consideration
in the business application, while in many cases the
scientific user does not expect to find ready-made
applications programs.

User Reactions Are Important 

User reactions to individual systems, also contained
in the full reports on each system, help to
lend a great deal of perspective to the benchmark
reports. These remarks were obtained from ten or
more different customers of each system, and they
are a very important part of each report. By
profiling the types of users attracted to each
system, and getting the benefit of their experience
with the product line, we begin to see the big
picture surrounding the details provided by the
benchmark study.

The individual reports also provide a detailed
portrait of the hardware and software provided
with each system. Such aspects as CPU configuration,
storage facilities, peripheral devices,
operating systems, available languages, utilities,
package software, editors, and documentation are
given full-page treatment within the reports on
each system. User comments are noted for each of
these parts of the total system, and summarized
near the end of each report. Finally, Real Decisions
Corporation outlines its own conclusions based on
the benchmark results, user comments, and its
experiences in running the tests for us.

Vendors Respond Favorably 

Despite the intense scrutiny to which the benchmark
studies have subjected these small computers,
the manufacturers involved have responded
very positively. As a whole, they have
complimented our efforts, and none have challenged
or criticized the findings of the reports. In
most cases, they’ve gone out of their way to be
hospitable towards the lengthy benchmark process.
The reactions we’ve received seem to indicate
that our attempts to be unbiased and impartial
have been fairly effective. This has been a major
objective of the studies ... to obtain the most
complete view possible while appraising various
systems with what is, after all, a decidedly
comparative methodology.

Our plans for the immediate future include (1)
completing the remaining six issues of this first
benchmark series, (2) engaging another independent
firm to test larger systems for us, and (3)
allowing RDC to also begin testing smaller systems.
Each series of reports will contain twelve issues on
individual systems, two summary reports which will
compare the findings, and a loose-leaf binder.

We urge all of our members to subscribe to these
efforts. Those joining in the present series midway
will, of course, receive all of the reports already
issued, as well as the six yet to come. And should
any errors of fact be discovered in a particular
report, all subscribers would receive the appropriate
corrections. We feel confident these benchmark
reports are the most accurate, impartial
analysis of a group of competitors that anyone in
the industry can offer. While we’re not ready yet to
claim that we’ve started the Kentucky Derby of the
computer industry, we do at least have our first set
of race results. Now we can actually begin
comparing horses. It’s about time!



VOLUME 1, NUMBER 7, MAY 1979




Six-Issue Summary


• IBM 5110 • HEWLETT-PACKARD System 45

• DATAPOINT 1170 • TEXAS INSTRUMENTS FS990/10

• WANG 2200VP • DEC PDP-11V03


BENCHMARK REPORT is published and distributed by The Association of Small Computer Users, a not-for-profit users’ association, and
authored by Real Decisions Corporation, an independent consulting firm. ASCU’s distribution of BENCHMARK REPORT is solely for the
information and independent evaluation of its members, and does not in any way constitute verification of the data contained, concurrence
with any of the conclusions herein, or endorsement of the products mentioned. ©Copyright 1979, Real Decisions Corporation. No part of
this report may be reproduced without prior written permission from both Real Decisions Corporation and the Association of Small
Computer Users. First class postage paid at Boulder, Colorado 80301.


Twenty-page summary report provided to
BENCHMARK REPORT subscribers.


























Benchmark Tests of Six Popular Small Computer Systems 


by 

The Staff Of 

Real Decisions Corporation 


This extract from the BENCHMARK REPORT 
series published by ACU contains the test results 
of three “Real Life Problems” which were recently 
run on six single-user small computers. Created 
and run by Real Decisions Corporation (RDC), the 
three benchmark programs included here are: 

• C-1 — a scientific/engineering problem

• C-2 — a new product planning problem 

• C-3 — an accounts receivable problem 

The results for these test problems are reported 
for each of the following small computers: 

• IBM 5110 

• Datapoint 1170 

• Wang 2200VP 

• Hewlett-Packard System 45 

• Texas Instruments FS990/10 

• DEC PDP-11V03 

During the benchmarking process on each system, 
the RDC analyst exercised great care to obtain fair 
and consistent results. All benchmarks were run in 
BASIC with results displayed on the screen. 
Programs were loaded into memory and the 
stopwatch was readied. The RUN key and the 
stopwatch were pressed simultaneously; when 
results appeared on the screen, the stopwatch was 
stopped and the elapsed run time recorded. 
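Timing whole runs by hand this way measures wall-clock (elapsed) time rather than CPU time. For readers who want to reproduce the method on a modern machine, here is a minimal Python sketch of the same procedure; it is an illustration only, since the original benchmarks were written in BASIC and timed with a physical stopwatch:

```python
import time

def timed_run(program, *args):
    """Wall-clock timing in the spirit of the 1979 stopwatch method:
    start the clock as the program starts, stop it when results are
    ready, and record the elapsed run time."""
    start = time.perf_counter()            # RUN key and stopwatch together
    result = program(*args)                # program executes
    elapsed = time.perf_counter() - start  # results appear; stopwatch stops
    return result, elapsed

# timing a toy CPU-bound workload
result, seconds = timed_run(lambda n: sum(i * i for i in range(n)), 100_000)
```

Note that `time.perf_counter` is a monotonic clock intended exactly for this kind of elapsed-time measurement, which matches the stopwatch procedure more closely than CPU-time accounting would.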

Although the benchmark tests included in this 
extract reveal dramatic differences in the way each 
system handles a given problem, the results 
reported here should not be misinterpreted as the 
“whole story” about any of these six systems. 
Additional benchmarks to evaluate how a system 
handles CPU intensive and I/O intensive tasks 
have also been run for ACU’s Benchmark Report 
series, and these results should be taken into 
account to round out the total comparative 
standing of any particular system. 

In general, benchmarks are just one facet of
judging the capabilities of a small computer.
Prospective users are advised to investigate other
areas of equal importance when they are evaluating
a minicomputer system as a possible purchase.
These other areas concern the larger framework of
customer support provided by the vendor —
including applications packages, maintenance services,
documentation and user education.

Overview: Systems As Configured 
For Benchmarks 

IBM 5110 — The 5110 system tested was a 32K 
byte Model 2 processor with a hardwired BASIC- 
only language, CRT and keyboard, a dual floppy 
drive and a 120 cps printer. Price as configured at 
the time of the test was $19,975; however, recent 
reductions announced by IBM would now set this 
figure at $16,435. 

DATAPOINT 1170 — The Datapoint 1170 had a
processor with 48K bytes of memory, CRT and
keyboard, two diskette drives and an 80 cps
printer. The BASIC language was software implemented.
Price as tested was $20,330.

WANG 2200VP — The Wang 2200VP had a 32K 
byte processor housed in a workstation with three 
diskette drives. The CRT and keyboard sat on top 
of the workstation and a 120 cps printer completed 
the system. The BASIC language was hardwired, 
and the total price as configured was $20,700. 

HEWLETT-PACKARD SYSTEM 45 — The HP
System 45 was a compact desktop unit with a
memory of 29,882 bytes and a hardwired BASIC
language. CRT and keyboard, a built-in thermal
printer and two diskette units completed the
system. Price as tested was $23,650. The newly-announced
System 45B decreases this price, since
memory increments can now be bought for less
than half the cost of previous memory options.

TEXAS INSTRUMENTS FS990/10 — TI’s
FS990/10 comes in a packaged configuration which
includes 64K bytes of memory, two diskette drives,
CRT and keyboard, and a 150 cps printer. The
BASIC language was software implemented, and
the whole system’s price as tested was $16,745.















SIX SMALL 
SYSTEMS 




A. IBM 5110 

B. Texas Instruments FS990/10 

C. Datapoint 1170 

D. DEC PDP-11V03 

E. Wang 2200VP 

F. Hewlett-Packard System 45 



- 5 - 
















DEC PDP-11V03 — Digital’s PDP-11V03 had a
memory capacity of 56K bytes, CRT and keyboard,
with a dual floppy drive. Although no printer
was used on the machine we tested, the price of a
printer was added to make this configuration
comparable to the others. With printer, the total
price was $14,930.


The Benchmark Problems 

C-1: Scientific/Engineering Problem

This program solves a system of linear equations, 
using the Gauss-Jordan method of elimination. The 
program sets up the following system of ‘N’ 
equations with ‘N’ unknowns. 

0.1x1 + 0.1x2 + 0.1x3 + ... + 0.1xn = 0.2
0.1x1 + 0.3x2 + 0.3x3 + ... + 0.3xn = 0.4
0.1x1 + 0.3x2 + 0.5x3 + ... + 0.5xn = 0.6
...
0.1x1 + 0.3x2 + 0.5x3 + ... + 9.9xn = 10.0

To show that the run has been executed
successfully, the values of x1, x2, and xn are printed
at the end of the execution.
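RDC’s BASIC source is not reproduced in this extract. The last equation implies N = 50 (the final diagonal coefficient 9.9 = 0.1 + 0.2·49, and the final right-hand side 10.0 = 0.2·50). A Python sketch of the same system setup and a Gauss-Jordan solve, offered only as an illustration of the problem and not as RDC’s program, might read:

```python
def build_system(n):
    # Coefficient a[i][j] = 0.1 + 0.2*(min(i, j) - 1) for 1-based i, j;
    # right-hand side b[i] = 0.2*i, matching the equations above.
    a = [[0.1 + 0.2 * (min(i, j) - 1) for j in range(1, n + 1)]
         for i in range(1, n + 1)]
    b = [0.2 * i for i in range(1, n + 1)]
    return a, b

def gauss_jordan(a, b):
    """Reduce [a | b] to the identity, leaving the solution in b."""
    n = len(b)
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(a[r][col]))  # partial pivot
        a[col], a[piv] = a[piv], a[col]
        b[col], b[piv] = b[piv], b[col]
        s = a[col][col]
        a[col] = [v / s for v in a[col]]
        b[col] /= s
        for r in range(n):
            if r != col:
                f = a[r][col]
                a[r] = [rv - f * cv for rv, cv in zip(a[r], a[col])]
                b[r] -= f * b[col]
    return b

a, b = build_system(50)
x = gauss_jordan(a, b)
# the benchmark prints x1, x2 and xn to confirm a successful run
print(x[0], x[1], x[-1])
```

Subtracting each equation from the next shows why this system makes a good self-checking benchmark: the interior unknowns cancel to zero while x1 and xn come out to 1, so a correct run is easy to recognize at the terminal.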


C-2: New Product Planning Problem 


This program models the relationship between 
product production costs and profitability over the 
range of the next four years. A base line run is 
established, and several parameters are varied in a 
“what-if” mode on subsequent runs. Program 
output is printed in a standard report format of 
report line items across column years. The model’s 
display line items are: 


-Units Sold
-Selling Price
-Revenue
-Raw Materials
-Direct Labor
-Packaging
-Distribution
-Gross Profits
-Fixed Costs
-Net Before Taxes
-Taxes Payable
-Net Income
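The report does not give RDC’s model equations, only the line items above. A minimal Python sketch of a four-year model with this report structure follows; every parameter name and number in it is an illustrative assumption of ours, not RDC’s:

```python
def product_model(units, price, unit_costs, fixed,
                  tax_rate=0.48, years=4, growth=1.10):
    """Project the twelve report line items over `years` columns.
    unit_costs holds per-unit (raw materials, labor, packaging,
    distribution) costs. All values are illustrative assumptions."""
    rows = {k: [] for k in ("Units Sold", "Selling Price", "Revenue",
                            "Raw Materials", "Direct Labor", "Packaging",
                            "Distribution", "Gross Profits", "Fixed Costs",
                            "Net Before Taxes", "Taxes Payable", "Net Income")}
    for y in range(years):
        u = units * growth ** y                       # unit growth each year
        revenue = u * price
        rm, dl, pk, ds = (u * c for c in unit_costs)  # per-unit cost categories
        gross = revenue - rm - dl - pk - ds
        nbt = gross - fixed
        tax = max(nbt, 0) * tax_rate
        for k, v in zip(rows, (u, price, revenue, rm, dl, pk, ds,
                               gross, fixed, nbt, tax, nbt - tax)):
            rows[k].append(round(v, 2))
    return rows

# a base line run, then a "what-if" run varying one parameter
base = product_model(1000, 25.0, (4.0, 6.0, 1.0, 2.0), 5000)
what_if = product_model(1000, 27.5, (4.0, 6.0, 1.0, 2.0), 5000)
```

The "what-if" mode the article describes amounts to re-running the model with one parameter changed and printing the same report format side by side with the base line.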


C-3: Accounts Receivable Problem 

In this job, an accounts receivable file of 50 records 
is created. Each record has 10 fields: customer 
number, salesman number, year-to-date sales, 
prior month sales (five fields), payments and credit 
limit. The file is updated randomly 10 times by 
customer number for sales amounts and payments. 
A report is displayed with billing detail, including 
company, salesman, year-to-date sales, credit limit, 
amount outstanding and sales by month. 
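As with the other problems, the BASIC source is not reprinted here. A Python sketch of the job’s shape follows; the field names, amounts, and salesman assignment are illustrative assumptions, not RDC’s record layout:

```python
import random

def build_file(n=50):
    # one record per customer, all ten fields initialised
    return {c: dict(customer=c, salesman=c % 7 + 1, ytd_sales=0.0,
                    m1=0.0, m2=0.0, m3=0.0, m4=0.0, m5=0.0,  # prior months
                    payments=0.0, credit_limit=5000.0)
            for c in range(1, n + 1)}

def update(recs, customer, sale, payment):
    r = recs[customer]          # keyed access by customer number
    r["ytd_sales"] += sale
    r["m1"] += sale             # post the sale to the current month
    r["payments"] += payment

def report(recs):
    lines = []
    for r in recs.values():
        outstanding = r["ytd_sales"] - r["payments"]
        lines.append((r["customer"], r["salesman"], r["ytd_sales"],
                      r["credit_limit"], outstanding,
                      [r[f] for f in ("m1", "m2", "m3", "m4", "m5")]))
    return lines

recs = build_file()
random.seed(1979)                       # reproducible "random" updates
for _ in range(10):                     # ten updates by customer number
    update(recs, random.randint(1, 50), sale=100.0, payment=40.0)
billing = report(recs)
```

The original job exercised disk I/O by keeping the 50 records in a file and updating them in place; the in-memory dictionary above only mimics the record access pattern, not the I/O load.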

Capsule Conclusions 

IBM 5110 — On straight capabilities as tested
by these benchmarks, the 5110’s performance
was not outstanding. However, for those users
who are comfortable with the total line of IBM
products and familiar with the IBM mode of
operation, the 5110 is now a price-competitive
system — with the “bonus” of double-density,
dual-sided diskettes for maximum use of the
floppy disk system.

DATAPOINT 1170 — In these particular tests,
the Datapoint 1170 turned in the slowest times for
the accounts receivable and scientific/engineering
problems. But users who would like to start small in
acquiring computer power while still keeping upgrade
options open should check out the Datapoint
family of products, which offers a large
array of compatible hardware/software combinations.

WANG 2200VP — Clearly, Wang was an
outstanding performer in these tests. The 2200VP
demonstrated its powerful number-crunching capabilities
by capturing two “firsts” and missing a third
first place by only 1.4 seconds. Users with high
requirements for I/O handling are advised to
investigate this aspect of the system as reported in
additional benchmark runs done for the Benchmark
Report series.


(Continued — Page 11) 










BENCHMARK RESULTS

[Bar charts: run times for C-1, the scientific/engineering problem, on the
systems tested, including Datapoint 1170 (38:27.5) and DEC PDP-11V03
(14:43.4); all values appear in the summary table.]

* TI could not run this problem in BASIC on the FS990/10 because of
memory limitations. The results for this problem using FORTRAN and
POWER BASIC are available in April’s Benchmark Report.


BENCHMARK SUMMARY (results in Min:Sec)

REAL LIFE        IBM       DATAPOINT  WANG      HP          TI         DEC
PROBLEMS         5110      1170       2200VP    SYSTEM 45   FS990/10   PDP-11V03

C-1  Eng.        29:47.2   38:27.5    2:05.8    4:38.9      --*        14:43.4
C-2  Prod.       24.2      17.3       1.2       9.3         23.2       45.8
C-3  Acc. Rec.   4:11.0    6:50.4     3:20.0    5:05.8      3:18.6     4:14.0

* This program could not be loaded due to memory limitations.

COMMENTARY:

C-1. In this scientific/engineering problem Wang’s 2200VP was twice
as fast as its nearest competitor, HP’s System 45. Wang demonstrated
superiority in the area of complex calculations.

C-2. Wang’s powerful CPU capabilities again gained the 2200VP first
place in this “real life” problem concerning product planning.
The spread between Wang and the slowest result (DEC) is a factor
of 38.

C-3. Results were closest in this accounts receivable problem. TI’s
FS990/10 beat out Wang for the top spot by only 1.4 seconds.
Similarly, only three seconds separated IBM’s third place
result from DEC’s fourth place.


COMPARATIVE STANDINGS

[Charts: comparative standings for C-1: Scientific/Engineering Problem,
C-2: New Product Problem, and C-3: Accounts Receivable Problem.]




- 7 - 




































































Comparing Alternative Programming Languages 
APL versus FORTRAN & COBOL 

By Leon P. Stevens 
Standard Oil Company (Indiana) 

In conventional computer programming for business applications, COBOL is probably the most commonly 
used computer language, while for scientific and many other purposes, FORTRAN has long been the 
standard. At my company, Standard Oil (Indiana), these two are by far the most widely used. But are they 
the best languages for every application? I wanted to see how COBOL and FORTRAN stood up against a 
very special language, APL. Developed in the early 60’s by Kenneth Iverson of IBM, APL (which stands for 
A Programming Language) contains, among other features, a very powerful set of instructions highly suited 
to numerical analysis and complex mathematical problems. Given a suitable occasion for its use, could APL 
save time and effort during program development? We decided to put it to the test. 


The Application 

Here at Standard Oil (Indiana) the Treasurer’s 
Department requested that a system be devised to 
analyze and report on the profitability of domestic 
and international banks. The system would extract 
data from existing files, manipulate it using 
complex logical structures, and summarize the 
findings in report form for output via line printer. 

During the analysis stage, we found that new 
executive procedures would be needed to handle 
such a system. It was while writing those 
procedures that we realized this application would 
provide a suitable test case for the comparison of 
APL with our conventional COBOL/FORTRAN 
approach. This was due to two factors: the 
relatively small number of man-hours needed to 
develop the system, and the fact that large batch- 
type computer operations would not be involved in 
running the final product. (Since APL is an 
interactive language, and won’t run under some 
batch mode operating systems, a job requiring 
such operations would not have been a good test 
of its capabilities.) 

Our original estimate of the project showed that 
with conventional techniques, the bank reporting 
system would require 80 man-hours of analysis and 
200 man-hours of design and implementation. The 
cost of these phases was placed at $2,568 and 
$6,420, respectively. Computer charges were 
estimated at $3,580, bringing the total estimated 
cost of the project to $12,568. 

The COBOL/FORTRAN Approach 

Based on an analysis of the desired system,
conducted by our staff, a project plan was
developed. Under the plan, the system would use
an IBM time-sharing system located in Tulsa,
Oklahoma, and would require two separate
programs. One program, to be written in COBOL,
would extract financial data from an existing data
file and create a new file with the extracted data. A
second program, written in FORTRAN, would then
use the new file as input. The FORTRAN program
would manipulate the data using arithmetic calculations
and complex logical statements, producing
an output file which could be printed on a
Data-100 printer located in our Chicago office.

The two programs were then written and tested at
our Chicago office, using our batch computer
facilities which run under an MVS (Multiple Virtual
Storage) operating system. When the testing was
completed, the source programs were then
transmitted to the Tulsa computer, recompiled,
and installed on that computer’s VM/CMS (Virtual
Machine/Conversational Monitor System) operating
system. While the two operating systems differ
greatly, the exercise proved to be a textbook
example of system compatibility. The Tulsa
computer was able to successfully run the
programs, produce an output file, and transmit the
file to Chicago, where it was printed at our General
Office.

The APL Approach 

While development of the COBOL/FORTRAN 
system was in progress, I began work on a 
separate system which was implemented in APL. 
The APL version was to fully meet the user
objectives which had been defined
for the system. The approach I took was similar to











that used by our other staff, although some minor 
details did differ. 

In the APL version, the same data file was used as 
the starting point. A data extraction phase was 
designed to produce an intermediate file, which 
then would be input to a data manipulation and 
report writing phase. I designed the data extraction 
phase to provide as much flexibility as possible in 
selecting which banks would be analyzed by the 
report. Like the COBOL/FORTRAN version, the 
banks under study could be grouped by country. 
However, the APL version also allowed the user to 
specify individual banks, something the COBOL/ 
FORTRAN version did not. Of course, this could 
be added to the conventional system if desired. 

Another feature of the APL version was its ability 
to perform multiple extraction phases before 
entering the data manipulation and report writing 
phase. Again, this added to the flexibility of the 
system in selecting the data which was to appear in 
the final report schedules. 

Like the COBOL/FORTRAN version, the APL 
system ran on the Tulsa computer under the 
VM/CMS operating system. The results were then 
transmitted to Chicago, where they were printed 
on the Data-100 printer. 

Comparing the Systems 

In the end, two very similar financial analysis
systems were developed, one written in COBOL
and FORTRAN, the other in APL. After they were
completed and checked out, we compared the
efforts and costs expended during system development,
as well as the expense of operating the two
systems. We found that the APL system was 69%
cheaper to develop in terms of computer charges,
and required 73% fewer man-hours. Here is how
the development phases of the two systems
compared:


System           Man-Hours   Development Cost

APL                  45          $1,740
COBOL/FORTRAN       167          $5,627


In order to compare the two systems’ speed and 
cost of execution, we ran a series of three reports. 
All were done on the same facilities in Tulsa, and 
all were performed after 8 p.m. on the same 
evening. Because they were run after 8 o’clock, the 
user load on the computer was constant, and 
remained fairly low. Since the runs were done the 
same evening, there was no chance that any 
software changes could be made to the operating 
system between runs. Thus there were no artificial 
variances in the results. 

We selected three sets of groupings for the runs: 
the first analyzed French banks only; the second, 
French and Canadian banks; and the third, 
French, Canadian, and U.S. banks. In this way, we 
were able to estimate the performance of the two 
systems when running small, medium, and large 


tasks. This was the result:

                       Elapsed Time (Seconds)
Report Set             FORTRAN/COBOL    APL

France                      103         140
France, Canada              121         242
France, Canada, US          190         341


As you can see, the APL system ran considerably 
slower than the conventional FORTRAN/COBOL 
solution ... in the worst case, it took twice as long 
to complete the task. It’s not clear exactly why this 
happened, but it may be that the APL version 
required some system resource that caused queue 
formation, and thus delayed completion. 

We were surprised to learn, however, that
although the APL system ran slower, it did not
always cost more in CPU charges to execute the
same task. Figuring our VM/CMS system as $2.00
per computing unit, we came up with these cost
comparisons:

                         Computing Cost
Report Set             FORTRAN/COBOL    APL

France                     $ 2.14     $ 7.12
France, Canada               8.98      11.16
France, Canada, US          19.36      15.78


These benchmark comparisons show that the APL 
system had a much higher setup cost than the 
COBOL/FORTRAN system did, but once up and 
running, it imposed incremental costs at half the 
rate of the conventional system. Thus it is likely 
that if a much larger task were run, the APL 
version would come out as the cheaper system, 
though in terms of real time it would probably still 
run slower. However, the delays in execution 
caused by the requirements of the APL operation 
(the theorized queue formation noted above) 
would not tend to increase the cost of execution, 
since no additional CPU time would be required. 
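The pattern described here (higher setup cost, lower incremental cost) can be made concrete with a rough linear fit to the three cost figures. Treating the report sets as task sizes 1, 2, and 3 is an illustrative assumption of ours, since the article gives no formal size metric:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

sizes = [1, 2, 3]                    # small / medium / large report sets
conventional = [2.14, 8.98, 19.36]   # FORTRAN/COBOL costs from the table
apl = [7.12, 11.16, 15.78]           # APL costs from the table

a_c, b_c = fit_line(sizes, conventional)   # lower intercept, steeper slope
a_a, b_a = fit_line(sizes, apl)            # higher intercept, shallower slope
breakeven = (a_a - a_c) / (b_c - b_a)      # size where the two lines cross
```

With these figures the fitted APL slope comes out at roughly half the conventional slope, and the lines cross between the second and third report sets, consistent with APL being cheaper only on the largest run.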

Documentation Considerations 

While the conventional COBOL/FORTRAN system
included program documentation for use in
maintaining the software, the APL system was
developed with very little documentation. In the
normal sense, this means the APL system, as it
was written, is not maintainable. This could have
been remedied by adding documentation, but given
the sharply reduced development cost of APL,
another solution is possible. The APL program
could be considered “throwaway code.”

The concept of throwaway code means that if a 
change in the function of the system were required, 
the portion of the system code affected would not 
be modified. Instead, that area of the code would 
be thrown away, and a replacement section written 
from scratch. While this is very expensive for 
conventional programming languages, with APL 
the cost of the throwaway code might turn out to 
be trivial. 


Conclusions 

Our comparison of development time and operating
costs of using APL versus the more commonly
used COBOL and FORTRAN languages brought
out some interesting facts. For the development
and execution of some applications, APL can
provide significant savings over conventional programming
languages. In the case of this study, APL
required only 27% of the hours and only 31% of the
total cost needed for program development.

Despite its lower development costs, our study
showed that an APL program may cost considerably
more to run than a comparable batch
program. However, APL could be of value in such
cases by providing an actual example of the system
in operation. The user department could then
study the results and, having access to actual
system output, could more precisely state the
system requirements. This more accurate system
definition could then be brought to conventional
system development, resulting in better utilization
of information systems personnel.

Aside from the possibilities of using APL as an 
analysis tool, the language provides an ideal vehicle 
for actual implementation of small financial models, 
small operating reports, statistical studies, and the 
like. It could also be used effectively for engineering 
models and reports, and budget preparation and 
consolidation. 

The IBM version of APL which we used does have
some limitations. These include difficulty in processing
large tape and disk files. While APL does
run under VM/CMS or MVS/VSPC (Multiple
Virtual Storage/Virtual Storage Personal Computing,
another IBM operating environment), it cannot
run in batch mode under MVS. In addition, the
language could be prohibitively expensive for file
inquiry, when compared to CICS (IBM’s Customer
Information Control System).

After considering all the factors, our study showed 


- 10 - 







that, at least within my company, there is a need 
for APL, and that APL can indeed greatly reduce 
program development efforts. But can it be an 
effective long-term substitute for such workhorse 
computer languages as COBOL and FORTRAN? 
At this point, I have to say it is still an open 
question. 


(Benchmark Tests — Continued from Page 6) 


HEWLETT-PACKARD SYSTEM 45 — Overall,
System 45 performed competently in these problems,
second only to Wang in two problems and
below the median in the third test. HP received
generally high ratings in hardware reliability and
portability — a desktop and an electrical outlet are
all the requirements for this compact, high-performance
package to be ready to go to work.

TEXAS INSTRUMENTS FS990/10 — Because
TI’s interpretive BASIC takes up so much memory
space, the FS990/10 could not load the C-1
problem. Results in POWER BASIC and FORTRAN
— both reported in the April issue of
Benchmark Report — were competitive and
impressive. TI came up number one (by 1.4
seconds) in the accounts receivable problem and
scored below the median in the new product
planning problem. Users feel TI offers a cost-effective
product through its network of OEMs.

DIGITAL EQUIPMENT CORPORATION —
PDP-11V03 — The PDP-11V03’s strongest
showing was in the scientific/engineering problem,
where it took third place. In the other two
problems it was below the median, with the
weakest area being new product planning. Users
should be aware that the price quoted is without
any applications software, which is added by the
OEMs. Upward compatibility through the broad
PDP-11 line provides extensive upgrade potential,
although the V03 itself has a memory limitation of
64K bytes.


CORPORATE ASSOCIATE MEMBERS* 

ADP Network Services 
American Terminal Leasing 
Avco Computer Services 
Boeing Computer Services Company 
CallData Systems 

Citibank - Interactive Computing Center 
Corporate Time-Sharing Services, Inc. 
Datanetwork, Honeywell, Inc. 

G. E. Information Services Company 
Informatics - Data Services Div. 

Insco Systems Corporation 
I. P. Sharp Associates, Ltd. 

Litton Computer Services 
Martin Marietta Data Systems** 

Metrocom Inc. 

National Computer Network 
On-Line Systems, Inc. 

Quantum Science Corporation 
R.A.I.R., Inc. 

Rapidata, Inc. 

Scientific Time-Sharing Corporation 
SDC Search Service 
Sun Information Services 
Telenet Communications Corporation 
Time-Sharing Resources, Inc. 

Trendata 

United Computing Systems 

University Computing Company 

Vocal Interface 

Warner Computer Systems 

Western Union - Data Services Company 

Zeta Research 

* Previous Corporate Associate Members of ATSU
are now shown as Corporate Associate Members
of ACU.

**New Member. 

Companies supplying computing products or services 
are eligible to apply for Corporate Associate Member¬ 
ship by writing to the Association. 


Correction: 

The Volume Numbers on the last two issues of “Interactive 
Computing ” were stated incorrectly. The correct numbers are: 

March/April 1979 — Volume 5, Number 2 
May/June 1979 — Volume 5, Number 3 


- 11 -







Association of Computer Users — Chapters, Local Contacts and Special Interest Contacts 


ALABAMA 

Winston Brooke 
Anniston — TSS Local Contact 
Brooke, Freeman, Berry & McBrayer 
( 205 ) 238-1040 

ARKANSAS 

Gene Dugger 

Searcy — SCS Local Contact 
Harding College 
( 501 ) 268-6161 

CALIFORNIA 

Richard Dumas 

Mountain View — TSS Local Contact 
Commodity Research Institute 
( 415 ) 941-4646 

Frederick Gallegos 
Los Angeles — TSS Local Contact 
U.S. Gen’l Accounting Office 
( 213 ) 688-3809 

Don Hatch 

San Diego — SCS Local Contact 
Christian Mgmt. Consulting Services 
( 714 ) 293-3200 

Jim Rigby 

Brea — SCS Local Contact 
Rodgers & Rigby, CPA’s 
( 714 ) 990-3613 

Frank Slaton 

San Bernardino — TSS Local Contact 
California State College 
( 714 ) 887-7293 

COLORADO 

Michael J. O’Connell 
Denver — SCS Local Contact 
Assoc. of Operating Rm. Nurses
( 303 ) 755-6300 

CONNECTICUT 

Frank Chew 

Greenwich — TSS Local Contact 
Amax, Inc. 

( 203 ) 622-2824 

Charles J. Clock, Jr. 

Special Interest Contact for 
Educational Applications 
West Hartford Public Schools 
( 203 ) 236-6081 

FLORIDA 

William A. Rousseau 

Pompano Beach — TSS Local Contact 

Alpine Engineered Products, Inc. 

( 305 ) 781-3333 

J. L. VanGoethem 

Miami — SCS Local Contact 

Ryder System, Inc. 

( 305 ) 593-3726 

HAWAII 

Richard Riehle 

Honolulu — SCS Local Contact 
The Printout 
( 808 ) 536-6532 

IDAHO 

Rick Simon 

Boise — TSS Local Contact 
Morrison-Knudsen Company 
( 208 ) 345-5000 

ILLINOIS 

* Leon Stevens 

Chicago - SCS Chairman, SCS 
and TSS Local Contact 
Standard Oil Company 
( 312 ) 856-6689 

John A. Koziol 

Chicago — TSS & SCS Local Contact 
Continental Materials Corp. 

( 312 ) 565-0100 


IOWA 

James E. Lewis 

Marshalltown — SCS Local Contact 
Iowa Valley Comm. College District 
( 515 ) 752-4643 

KENTUCKY 

Clyde Jenkins 

Special Interest Contact for APL 
Humana Inc. 

( 502 ) 589-3790 

LOUISIANA 

W. D. Landry 

Abbeville — SCS Local Contact 
Coastal Chemical Co., Inc. 

( 504 ) 893-3862 

MARYLAND 

R. G. Korbeck
Baltimore — TSS Local Contact
Baltimore Gas and Electric Company
(301) 234-6687

MASSACHUSETTS 

* Stuart Lipoff
Boston — TSS Local Contact and
Special Interest Contact for
Software Standards
Arthur D. Little, Inc.
(617) 864-5770

METRO WASHINGTON, D.C.

Frank E. Rockwell
Lanham — SCS Local Contact
Astro Data Systems, Inc.
(301) 577-5838

A. Steven Wolf
DC — TSS Local Contact
U.S. General Accounting Office
(202) 655-4000

MICHIGAN 

J. Ben Friberg
Grand Rapids — TSS Local Contact
Rapidstan Inc.
(616) 451-6682

Tom Hunt
Cadillac — TSS Local Contact
Kysor Industrial Corp.
(616) 775-4646

* Larry Leslie
Kalamazoo — TSS Vice-Chairman
and Special Interest Contact for
Time-Sharing Administrators
Upjohn Company
(616) 323-4000

MINNESOTA 

L. R. Bakewell
St. Paul — SCS Local Contact
Real Estate Dynamics, Inc.
(612) 698-8891

MISSOURI 

Dann E. Kroeger
Kansas City — SCS Local Contact
Townsend Communications, Inc.
(816) 454-9660

NEW JERSEY 

Jim Fitzpatrick
Special Interest Contact for
Data Base Applications
American Broadcasting Corp.
(201) 488-2345

Robert J. Loring
Haddonfield — SCS Local Contact
Cardiac Long-Term Monitoring SVC
(609) 795-2220

* Bennett Meyer
Wayne — SCS Vice-Chairman, and
Special Interest Contact for
Data Security
Singer-Kearfott
(201) 256-4000

Samuel A. Scharff
Englewood — SCS Local Contact
Consulting Engineer
(201) 569-8332

NEW YORK 

Dr. Dina Bedi
Special Interest Contact for
Educational Applications
Baruch College
(212) 725-3196

Terri Gendron
Briarcliff Manor — TSS Local Contact
Phillips Laboratories
(914) 762-0300

Samuel Leonard
Elmira — TSS Local Contact
Thatcher Glass Mfg. Co.
(607) 737-3459

Philip N. Sussman
New York City — TSS & SCS Local Contact
International Paper Company
(212) 490-5827

NEW YORK CITY CHAPTER 
Executive Board: 

Aram Bedrosian
TWA

Bion Bierer
Bristol Myers

Victor Bittman
Chase Manhattan

Charles Browning
Phelps Dodge

Dennis Callahan
Goldman Sachs & Co.

Chester Frankfeldt
Continental Group

Carl Heimowitz
Harcourt Brace Jovanovich

Alan Kornbluth
American Express

Susan McCain
Morgan Guaranty

Arthur Schneyman
Mobil Oil

Indira Singh
Salomon Brothers

Philip Sussman
International Paper Co.

OHIO 

Dennis Bender
Cincinnati — TSS Local Contact
Procter & Gamble
(513) 562-2469

Ed Casper
Cleveland — TSS Chapter President
Diamond Shamrock Corp.
(216) 694-5566

Howard Tureff
Cleveland — TSS Local Contact
Diamond Shamrock Corp.
(216) 694-5963

ONTARIO 

* David Wilson
Toronto — TSS Chairman, and
TSS and SCS Local Contact
P.S. Ross & Partners
(416) 363-8281

OREGON 

Paul Gehlar
Salem — SCS Local Contact
Oregon Fruit Products Co.
(503) 581-6211

PENNSYLVANIA 

Dale Hummer
Pittsburgh — TSS Local Contact
Westinghouse Electric Corp.
(412) 273-6169


*Council Members for 1979, in addition to officers listed below. 


D. T. Wu
Philadelphia — TSS Local Contact
DuPont De Nemours & Co.
(215) 339-6307

QUEBEC 

Andre Pitre
St-Laurent — TSS Local Contact
S.D.S. Inc.
(514) 744-4454

TEXAS 

Ralph N. Bussard
Houston — TSS & SCS Local Contact
Price Waterhouse & Company
(713) 654-4100

Ankarath Unni
Dallas — SCS Local Contact
Sun Production Company
(214) 739-9301

UTAH 

Melvin D. Nimer
Salt Lake/Provo — SCS Local Contact
McNally Mountain States Steel Company
(801) 785-5085

VIRGINIA 

John Hudson
Danville — TSS & SCS Local Contact
Dan River Inc.
(804) 799-7101

W. W. McChesney
Alexandria — SCS Local Contact
Country Legend Stores, Inc.
(703) 370-9850

WISCONSIN 

Anil K. Bhala
Green Bay — SCS Local Contact
L. D. Schreiber Cheese Co.
(414) 437-7601

Jack Kochie
Racine — SCS Local Contact
Medical Engineering
(414) 639-7205

David J. Ritter
LaCrosse — SCS Local Contact
LaCrosse Garment Mfg. Co.
(608) 785-1400

John J. Stewart
Wausau — SCS Local Contact
Van Ert Electric Co., Inc.
(715) 845-4308

Paul Thoppil
Milwaukee — TSS Local Contact
RTE Corporation
(414) 547-1251

Robert Whitney
Eau Claire — SCS Local Contact
Owen Ayres & Associates, Inc.
(715) 834-3161

LOCAL CONTACTS WANTED 

Become a local contact for your area. 
Your name and telephone number will be 
listed on this page in each issue of 
Interactive Computing, enabling other 
members to contact you with their 
questions. Only users, not suppliers, are 
eligible to apply by writing to the 
Association. Please specify which of the
following Sections you would like to serve for:

• Time-Sharing Section

• Small Computer Section

• Midi Computer Section

• Large Computer Section

• Distributed Processing

• Word Processing Section

• Home and Hobbyist Section







The Newsletter of the Association of Computer Users 


Published every other month by the Association of Computer Users, Inc., formerly the Association of
Time-Sharing Users and the Association of Small Computer Users. Copyright 1979. P. O. Box 9003,
Boulder CO 80301, Telephone (303) 499-1722. Second Class postage paid at Boulder CO 80302.

Hillel Segal, President; Stuart Lipoff, Vice President; Earl Carroll, Treasurer; Martin Neville, Secretary 

An independent non-profit association, providing a forum for the discussion of computing topics.