Part II. Hardware / Software 1992 - 1995
[Previous] [Index] [Next]

Networking and Windows 3.11

I had pretty much mastered MS-DOS, its applications, and the hardware of the computer by the end of 1994. By then I had multiple computers: the same 486DX-33, a 486SX-33, and a few older machines including 286s and XTs. Now having at least two relatively modern computers, the two 486s, I wanted to be able to connect them together. The original motivation was not to learn client-server architectures; it was as simple as wanting to play computer games over a network and transfer files more quickly.

My first attempt at building a network was to take both computers, now running MS-DOS and Windows 3.11 for Workgroups, and connect them via a serial null modem cable. The setup was slow, but it was faster than transferring files via floppy disks. The original network used both NetBEUI and IPX but not yet TCP/IP; I did not know about TCP/IP at the time. It was a fun setup, and my friends and I played games on it and sent each other jokes over the network.

I thought that was it for me in networking. Little did I know that at a computer show my parents, persuaded by a salesman, would purchase a network kit. At first I wondered why we should spend more money on networking when I could network computers for less via serial cables. Neither my parents nor I knew what kind of kit had been purchased; I found out a year later that it was an Ethernet kit, and it was much faster than the serial cable. I set up the network again, this time using Ethernet, and the minute I transferred a large file across it I realized just how fast it was. A 10MB file could be transferred in about a minute, as opposed to 45 minutes over the serial cable.
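
The difference is easy to check on the back of an envelope. A minimal sketch in Python, assuming the serial link ran at roughly 38400 bps and the Ethernet at its nominal 10 Mbps (both figures are assumptions; real throughput was lower once protocol overhead was added):

    # Idealized transfer times for a 10MB file; the link speeds are assumptions.
    FILE_BITS = 10 * 1024 * 1024 * 8                  # 10 MB expressed in bits

    for name, bits_per_sec in [("null modem serial (38400 bps)", 38_400),
                               ("Ethernet (10 Mbps)", 10_000_000)]:
        seconds = FILE_BITS / bits_per_sec            # ignores protocol overhead
        print(f"{name}: {seconds / 60:.1f} minutes")

    # serial:   ~36 minutes, close to the ~45 observed once overhead is added
    # Ethernet: ~0.1 minutes, i.e. well under a minute at wire speed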


Modem and The BBS

I also acquired a modem, for free, from a friend of mine during the time I was working on networking. It was an old 2400 baud modem that he had no use for after upgrading to 9600 baud. The first thing I did was search through the computer magazines for BBS numbers to dial and start dialing them. The connection was slow, of course, but I had fun doing some mailing and downloading small shareware programs.


Two Events, The Web and Linux

Two events occurred that really became a turning point for me. Up until then, I had mainly worked on computers as merely stand-alone machines, and whatever networking I had done was for very simple tasks. The first event was The Cooper Union internship program, and the second was a chance encounter with a gentleman by the name of Rexford Ballard at a Software Etc. in Herald Square (6th Avenue and 34th Street). Cooper Union was the place where I was exposed to two new things. The first was the Internet presented in an entirely new way, namely the World Wide Web. The second was a Unix machine, a Sun Microsystems SPARCstation 5 running Solaris.


The Internet and Internet Access

The World Wide Web, as it was called, was a radically different experience for me. I had traditionally navigated the Internet in text mode, dialing into a BBS through a terminal program. Using a small piece of software called a browser, the Internet was now presented in a way that allowed me to traverse it far more quickly. Navigating the web also requires that the machine itself be directly connected to the Internet, which means that a terminal connection into someone else's server would not work; another form of connection was required. That connection was called PPP. PPP allowed my computer to be directly connected to the Internet, as opposed to being a dumb terminal to a server that held the Internet connection, which made tasks such as downloading data much easier and simpler. This began my long trek from the small stand-alone PC world to a more networked environment with new concepts and architectures.


The Linux Encounter

The computer on which I first saw the World Wide Web was not a PC running MS-DOS and Windows but a Sun SPARCstation running Solaris. I did not know it at the time, but the computer I was operating was running Unix. I became very intrigued by this new system, and I asked one of the staff members the cost of such a computer. The staff member responded, "$25,000." Needless to say, I was shocked. The idea that a computer could cost more than a car dumbfounded me, so I asked for a version of Unix that could run on a PC such as my 486. His response was SCO Unix. With that in mind, I began looking for SCO Unix to purchase, but I discovered that the cost of acquiring a fully functional SCO Unix OS and its associated applications was around $4,000 at the time.

However, as luck would have it, I encountered a person by the name of Rexford Ballard in a Software Etc. during June of 1995. The encounter, I believe, happened when he overheard me talking to an employee about Unix software, particularly SCO Unix. Software Etc. and other consumer electronics retailers stocked only MS-DOS/Windows titles and a small number of Macintosh ones. Unix was a term most people had not even heard of, let alone knew what it was. Well, this gentleman, Rexford Ballard, told me about another Unix-like operating system that cost nothing, could be freely and legally distributed, and allowed access to its source code. That Unix-like operating system was called Linux, and this was June of 1995, two full months before the release of Windows 95. Now I thought to myself: Linux, a free operating system with no major applications available and one I had never heard of, versus SCO, a commercial operating system with support and major applications including WordPerfect. At first I balked at Linux for having no applications and no real support. I would need people to help me with Unix if I were to be successful, and Unix people are very hard to come by, especially in the middle of Herald Square. Rex Ballard advised me to acquire a publication that contained an actual Linux software distribution and start from there.


First Linux Experience

I made the momentous decision to go with Linux as opposed to SCO Unix. The reason simply boiled down to economics. SCO Unix was simply not affordable, and a Sun SPARC was totally out of the question. Linux, despite its lack of commercial applications, could be acquired at no cost. The availability of the source code simply did not interest me initially, since I was not familiar with the concept of distributing source code for free. At a local CompUSA store I purchased a publication called "Linux Unleashed" from Sams Publishing, which contained the Slackware 2.2.0 distribution of Linux. The computer I first installed Linux on was my first 486DX, now with 32MB of memory and three hard disks totalling 2GB. I went ahead and installed Linux, configuring the machine as a dual-boot system, not knowing what to expect. My first installation was terrible: Linux destroyed all of my data on the DOS partition, and nothing worked right after installation. I believe I must have gone through five or six complete reinstallations of Linux before it worked at least half-well.

My first experience with Linux was one of shock. My expectation for Linux was a fully developed operating system with a nice GUI, plenty of applications, and easy configuration of peripherals. None of this was true of Linux. The first task I had to accomplish was configuring a GUI for Linux, which in the Unix world is called the X Window System; the PC world's counterpart was Microsoft Windows. That endeavor took me three months of struggling just to get some lousy graphics display, especially compared to the commercial Unix systems. The next task was to somehow acquire applications, especially WordPerfect, to do my school work. Well, WordPerfect did not run on Linux, and the best applications I could find were those of the Andrew User Interface System. Printing was the worst of my troubles with Linux; it wasn't until three years later that I was finally able to print directly from Linux. I struggled with Linux for the next six months with no real support or help other than the book I had purchased and Rex Ballard, when I got a chance to contact him.


Rex Ballard and The Linux Server

Rex Ballard was really the only person who assisted me in any real capacity during my struggling days with Linux. He worked at Standard & Poor's/McGraw-Hill at the time of my encounter with him and had worked in the information technology industry for over 20 years, longer than I had been alive. Rex assisted me in operating Linux despite my lack of understanding of the operating system, and he even came to my house and configured software for me. With his assistance, I slowly began to increase my proficiency with Linux to the point where, within 12 months, I could implement a simple TCP/IP-based client-server network. That simple network consisted of one MS-DOS/Windows 3.11 client and one Linux server running FTP, HTTP, and Samba. Though I had a terrible experience with Linux as my desktop, mostly due to Linux's immaturity, I had far more success using Linux as a server.
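
The client-server idea at the heart of that setup is simple enough to sketch in a few lines. The following is a minimal illustration in modern Python, not the actual 1996 setup, which used stock daemons such as FTP and Samba rather than custom code: a server listens on a TCP port, and a client connects and exchanges a message with it.

    import socket

    # Minimal TCP server: bind, listen, answer one client, then exit.
    def serve_once(host="127.0.0.1", port=5050):
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
            srv.bind((host, port))
            srv.listen(1)
            conn, addr = srv.accept()
            with conn:
                request = conn.recv(1024)            # read the client's message
                conn.sendall(b"hello " + request)    # send the reply

    # Minimal TCP client: connect, send a message, print the reply.
    def client(host="127.0.0.1", port=5050):
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as c:
            c.connect((host, port))
            c.sendall(b"world")
            print(c.recv(1024).decode())             # prints "hello world"

Run serve_once() in one process and client() in another; every service on that server followed this same accept-and-respond pattern, just with more elaborate protocols.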

Linux as a server operating system was much easier to implement than Linux as a desktop. The server had far fewer variables and needed only to perform a few specialized tasks well. In my case, it just had to serve information quickly, reliably, and, most of all, cheaply. Rex Ballard aided me in getting my Linux web server operating by actually sitting in front of the computer and configuring the web server. I later learned that a server process is called a daemon in the Unix/Linux world. The first web server I ran was not Apache but the NCSA web server; Apache was still very new at the time, and I soon switched to it once NCSA stopped updating their web server software.

The support from Rex Ballard enabled me to climb the steep learning curve in 12 months. I learned the basics of operating Linux as a user and later as a system administrator. My 486DX, which for its first three years had operated as a regular home PC running MS-DOS and Windows, had now been transformed into a server. I was soon able to put my server on the Internet full time for the first time. The server began providing web, mail, FTP, and shell services in late 1996. Rex Ballard also provided me with information in my attempt to build a dynamic, interactive web site. Such advice included the use of HTML, CGI scripts, the Perl language, the WAIS search engine, web spiders, and simple databases. As with experiencing Linux for the first time, I initially struggled in creating an interactive web site, particularly in developing CGI scripts.

My first project was an Internet search engine. I wanted to mimic Yahoo's search engine, which was the best at that time. The project was completed using GPL software from people on the Internet, with the capability to search the Internet and archive information. Unfortunately, there was a problem. My inexperience in network programming and Linux created a situation whereby my 486DX-33 would overload web servers and crash them. The strange thing is that the servers that were overloaded and eventually crashed were all Windows NT 4.0 servers; I discovered that I had crashed over 250 of them. It was my first inadvertent DoS, or denial of service, and I got into trouble for it.
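
The missing ingredient was rate limiting: a spider that issues requests as fast as its loop can run will hammer a server. A minimal sketch in Python of the kind of per-host throttling that would have prevented the overload (the URL and the delay value are illustrative; the original spider was assembled from 1990s GPL tools, not this code):

    import time
    import urllib.request
    from urllib.parse import urlparse

    DELAY_PER_HOST = 2.0      # seconds to wait between hits on the same host
    last_hit = {}             # host -> time of the most recent request

    def polite_fetch(url):
        host = urlparse(url).netloc
        wait = DELAY_PER_HOST - (time.time() - last_hit.get(host, 0.0))
        if wait > 0:
            time.sleep(wait)  # throttle so no single server is ever flooded
        last_hit[host] = time.time()
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.read()

    # Illustrative use: page = polite_fetch("http://example.com/index.html")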


Windows 95

The crashing of so many Windows NT servers got me wondering, especially after I learned that the machines were Pentium 90 and 100MHz computers running Windows NT. Those Pentium computers should have been more powerful and more capable than my slower, older 486, and yet the 486 was still able to overload and crash them. I began to realize that the difference was the software being used. The Pentium computers were running Microsoft Windows as their operating system, and my 486 was running Linux. Therefore, I thought, if Linux running on an old 486 could overload a more powerful Pentium running Windows NT, then Linux had to be a more efficient operating system than Windows NT. That was when I really discovered the power of Linux, and later Unix. Linux was able to extend the life of my 486 at no monetary cost, and it came at a good time for me.

Windows 95, after intense premarketing, was introduced in August of 1995. I was at first very excited, along with many of my friends, since it would be the first time I would have an affordable 32-bit desktop system; Linux already occupied the servers. In addition, it would also mean that the entire computer industry would leave 16-bit behind. Unfortunately, Windows 95, despite having a friendlier GUI desktop than Linux, was not as fulfilling as Linux. Windows 95 promised us that computer crashes and 16-bit code would be things of the past. Well, that was not the case. Though Windows 95 crashed less than Windows 3.11, it still crashed about once a day; Windows 3.11 crashed every two hours. Linux crashed once a month, and it crashed even less as I improved my Linux skills. Windows 95 also did not fulfill the promise of being completely 32-bit. It was nothing more than a 32-bit GUI on top of a 16-bit MS-DOS. I thought that was a great waste of money. Spending $200 on Windows 95, thinking it was a completely revamped operating system in which all of the problems associated with MS-DOS/Windows 3.1 had been solved, and then discovering that it was merely a facade, was a real letdown.

The real kicker with Windows 95 was the system requirements. The software required a minimum of a 386DX-16 with 4MB of memory, with a recommended configuration of a 486DX2-66 with 8MB. In reality, the operating system really required a Pentium computer with 16MB of memory. Bear in mind that the Pentium computers on the market in late 1995 cost over $5,000. That meant I would not only have to discard my 486 after only three years of use but would have to spend $5,000 on a new computer just to run a 32-bit operating system. Fortuitously, my exposure to Linux, however difficult it was at first, allowed me to accomplish my goals without costing me a fortune.


Windows NT 4.0

Windows 95 was not the only thing I was curious about; I was also interested in learning Windows NT. My first experience with Windows NT was crashing NT servers when I ran my web spider on Linux for the first time. I felt that Windows NT simply was not efficient enough to make better use of the faster hardware available to it. In addition to its inefficiency, a client license had to be purchased for every user who would connect to the server. The idea of paying for every user when the number of users could not be predicted, as in the case of operating a web server, was ludicrous. There was no way to control cost, and cost was the biggest factor for me, especially since I was in school at the time. I still wanted to learn Windows NT anyway, since it provided a multi-user, 32-bit desktop environment as opposed to the brain-dead facade of Windows 95. Windows NT also crashed less than Windows 95 but was nowhere near the reliability of Linux.

I purchased Windows NT with a student discount around mid-1997. I was able to install Windows NT now that the cost of a Pentium computer was a lot lower. The computer I installed Windows NT on was a Pentium 100 with 32MB of memory and a 1GB hard drive. My experience with Windows NT was that it was fine for a single-user desktop and could run major applications, including WordPerfect. Out of curiosity, I even tried to implement the same interactive, dynamic content on Windows NT as I had done on Linux. Ultimately I could not use Windows NT for that purpose, due to its technical insufficiencies and its licensing economics.

My implementation would have used Internet Information Server 3.0, Index Server, and other software. However, not only was creating the dynamic content prohibitively difficult, but the software was a real drain on resources. My faster Pentium 100 simply could not keep up with the 486 running Linux, and the implementation certainly could not handle as many users as the Linux implementation. That was when I came to feel that Windows, in any form, should never be used as a server; the software was simply not designed for the task. As a result, with the rapid improvements in Linux, especially on the desktop, I believed Linux would not only dominate the server but spread to the desktop as well.


Hacked! Computer Security

Computer security was an issue I never really gave much thought to at first. My only real concept of computer security was BIOS and screensaver passwords and computer viruses. The reason was that I started on MS-DOS and Windows 3.1, which were designed for a single user; MS-DOS and Windows 3.1 therefore had no security. Windows 95 had merely a security facade; it too was a single-user system, just with a nice GUI. Linux and Unix, on the other hand, were designed as multi-user operating systems and thus required that users be logically separated from each other. The relevant concepts include user accounts protected by passwords, a file system permission hierarchy, resource quotas, auditing, and host access permissions (firewalls, filters, TCP wrappers). Windows NT was designed for corporate use and thus did have security in the form of user accounts, a file system permission hierarchy, and system auditing. However, Windows NT, unlike Linux and Unix, was not designed as a multi-user operating system and lacked many security components that would be necessary in a server operation.

I became aware of computer security, like many people, the hard way: my Linux server quickly got cracked and DoSed by people on the Internet. Obviously I was not a very happy camper, but it was through this experience that I began to harden my server and learn basic computer security concepts. The first set of concepts I learned covered limiting the vectors of attack, limiting information leaks, and operational security. I first limited the vectors of attack by disabling nonessential services such as chargen, echo, and netstat, among others. Remote and local information leaks were managed by implementing password shadowing and PGP encryption, and by removing nonessential information from public view. Operational security was perhaps the most important concept I learned: it is the way the computer is operated so as to ensure the greatest integrity and confidentiality of the information.
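
Concretely, on a system of that era, disabling the small nonessential services meant commenting them out of /etc/inetd.conf, and host access control meant a default-deny TCP wrappers policy. A hedged sketch (the allowed network below is a placeholder, not my actual configuration):

    # /etc/inetd.conf -- comment out nonessential internal services
    #echo     stream  tcp  nowait  root  internal
    #chargen  stream  tcp  nowait  root  internal
    #daytime  stream  tcp  nowait  root  internal

    # /etc/hosts.deny -- deny everything by default
    ALL: ALL

    # /etc/hosts.allow -- then allow only what is needed (placeholder network)
    in.ftpd: 192.168.1.0/255.255.255.0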

My operational security was atrocious. I originally logged in remotely, over a public network, via telnet as root. No wonder I was rooted, or cracked. In addition, I used terrible passwords without password shadowing, and I would download software from the Internet and run it without checking the source code. Furthermore, I never patched my systems. Well, after numerous cracks and DoS attacks, I wised up and implemented all sorts of security measures: password shadowing, TCP wrappers, encryption, resource quotas, software maintenance, locking down accounts (especially root), and, most importantly, always asking myself the question, "Does this action expose me to attack?" In addition, I began to arm myself with intelligence by joining security-oriented mailing lists and searching the Internet for information on vulnerabilities that could affect me.

This trial by fire, and these humble security measures, led me to learn and understand not only security concepts and risk management but also their implementations and implications.


FreeBSD

By mid-1998 I had become very experienced at running a Linux server. Security implementation and system enhancement had turned what was an experimental server into a full-fledged, reliable information services operation. I was spending so much of my time maintaining and operating my server that I was working on ways to automate the maintenance tasks. My good friend Michael Sawicki (Saturated Networks Inc.), to whom I had introduced Linux, turned around and introduced me to FreeBSD. I was at first reluctant to use FreeBSD, since I believed that Linux was the best thing in the world. Linux had opened doors of opportunity for me that would not have been possible with Microsoft Windows, so I got defensive about FreeBSD. Nevertheless, I knew Michael was not a zealous person, and I had taught him Linux anyway, so I decided to take his word for it and try FreeBSD.

Michael gave me a copy of FreeBSD 4.0 and assisted me in the installation and configuration process. The only real difference in the installation process was the partitioning scheme. Linux uses the partitioning structure from MS-DOS, which allows four and only four primary partitions; going beyond four requires turning one of them into an extended partition that holds logical partitions. FreeBSD uses a structure like that of Solaris, whereby a single primary partition (a slice) is created and, inside that slice, additional subpartitions are created. The subpartitions are where the data and swap are actually stored. This approach circumvents the limitations of the MS-DOS partitioning structure and allows a large number of subpartitions to be created in a cleaner, more organized fashion. Other than the disk partitioning, the difference between installing Linux and FreeBSD was almost negligible.
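
As a sketch, a typical single-disk FreeBSD layout of that kind looks like the following (the device names and mount points are illustrative, not my actual layout):

    ad0              the whole IDE disk
      ad0s1          slice 1: a single MS-DOS-style primary partition
        ad0s1a       /      (root file system)
        ad0s1b       swap
        ad0s1e       /var
        ad0s1f       /usr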

This was not the main difference between Linux and FreeBSD, though. Michael then presented the real magic of FreeBSD: its software management capability. Up until FreeBSD, I had managed the software on my Linux server manually, which took up a lot of time, and the maintenance time increased greatly as the number of information services I operated grew. The first component of FreeBSD software management is that the entire operating system source tree can be updated with one command via "cvsup". The second component is the FreeBSD ports system, which automates the installation, updating, and deinstallation of third-party software packages. The first obvious advantage was automation: I could script the entire process via cron. However, for security purposes I decided to automate only up to the tracking step, because I wanted to review the source code and any associated patches before allowing installation; I had learned my lesson about running software from untrusted sources. In addition to the software management, I could do the entire process remotely. Though I could also update Linux and third-party software remotely, it was often done manually, and there was a risk, particularly with kernels, that the system might not restart correctly. The automated processes for Linux weren't that mature at the time; the best one I could think of was Debian's. FreeBSD freed my time and my worry about managing software, and as a result my server's reliability increased even further.
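
In practice the routine amounted to a couple of commands, and the tracking step is the part a nightly cron job can run unattended (a hedged sketch: the supfile shown is the stock example shipped with FreeBSD, and the port chosen is arbitrary):

    # Track the source tree (-g: no GUI, -L 2: verbose progress)
    cvsup -g -L 2 /usr/share/examples/cvsup/stable-supfile

    # Later, after reviewing what came down, build and install a port by hand
    cd /usr/ports/www/apache13
    make install clean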

FreeBSD also had another advantage over a Linux distribution like Debian: its operating system updates came in the form of source code, which was then compiled on the local machine and installed. The advantage of this process over one based on binary updates was that the source code could be optimized for the system on which it would run. This mattered because Linux had become very popular by 1998, and many Linux distributions were trying to cater to the masses, the majority of whom already had Pentiums and Pentium IIs. As a result, the newer Linux distributions were optimized for the newer hardware at the expense of the older systems. By 1999, my 486DX-33 was already considered obsolete. However, my pride prevented me from retiring my 486. FreeBSD was able to do what Linux had once done for my 486: extend the life of the machine.


Solaris Revisited

My first exposure to a Unix system had been a Sun SPARCstation 5 running Solaris. Unfortunately, the high cost of both the machine and the software had prevented me from acquiring Solaris skills. However, my best teacher from my former high school, Mr. Goldman, notified me that they had received a donation of Sun hardware from a company and asked me to help them. I was overjoyed: I was finally going to get a chance to actually work on a Sun SPARC. Mr. Goldman lent me the machines for the summer so I could get acquainted with them. This was also where Linux showed its value. The skills I had acquired while struggling with Linux enabled me to master other Unix operating systems, including FreeBSD and Solaris, and all for the cost of a $50 book and Rex Ballard's help.

I learned Solaris system administration skills, including installation, configuration, and now security. Learning Solaris was so much easier than learning Linux, especially since I had gained Unix skills from Linux! Within three months, I had three fully functional SPARCs and full access to them. It was a power rush. In time, I would learn other Unix operating systems, though I would mostly concentrate on Linux, FreeBSD, and Solaris.


Information System Design

Information system design wasn't something I intended to do in 1996. My interest was in attempting to mimic an interactive web site such as Yahoo. Unfortunately, I did not have the skills required to accomplish such a task. I did not even know how to install and configure a web server, let alone write CGI scripts and HTML forms. Rex Ballard kindly gave me a hand by configuring my web server for me and gave me good information on writing CGI scripts and HTML forms. Rex also told me about the WAIS search engine, which included a robot.

Despite the good information Rex Ballard provided, I lacked the understanding to comprehend his advice. As with my struggle with Linux, I struggled again in trying to implement even a simple CGI script that would accept values from a form. The WAIS search engine was a distant thought as I worked on the CGI scripts. With fortitude, though, I was able to successfully implement a WAIS search engine with a web interface and a robot. The result of the WAIS search engine was not what I had hoped for: its search results were never really accurate, and it was slow. It was not a very effective search engine, and upgrading the server's hardware was not an option.
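
The core of such a script is actually small. A minimal sketch of a form-handling CGI program, written here in modern Python rather than the Perl of my original attempts (the form field name is hypothetical): the web server hands the script the submitted values in environment variables such as QUERY_STRING, and the script writes an HTTP header, a blank line, and then the page body.

    #!/usr/bin/env python3
    # Minimal CGI script: read a form value from the query string, echo it back.
    import os
    from html import escape
    from urllib.parse import parse_qs

    query = parse_qs(os.environ.get("QUERY_STRING", ""))
    name = query.get("name", ["stranger"])[0]    # "name" is a hypothetical field

    # A CGI program emits an HTTP header, a blank line, then the body.
    print("Content-Type: text/html")
    print()
    print(f"<html><body><p>Hello, {escape(name)}!</p></body></html>")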

I tried other search engines, including Harvest and Glimpse, but eventually settled on a search engine called ht://Dig from San Diego State University. This search engine, along with my increased skill in writing web interfaces, finally allowed me to create a search engine that was customizable and delivered good, comprehensive results quickly. I finally completed the search engine and was ready to unleash it on the Internet. Unfortunately, my inexperience allowed the robot to overload and crash many Windows NT servers, and I learned the importance of testing! Other interactive components that I developed or adapted were a guestbook, a counter, chat rooms, and a web cam.

From these simple interactive components I progressed to increasingly complex services. One of the most important concepts I learned in designing information systems was the multi-tier architecture: the separation of functions into smaller but simpler modular components. My original information system architecture consisted of a web interface that directly manipulated the data in the data store. This architecture was fine for simple applications, but the minute features needed to be added or modifications made, the complexity increased dramatically; I had abandoned projects because of such complexity. The new approach of splitting one monolithic application into many smaller modular components allowed me to create more complex applications quickly, efficiently, reliably, and, most importantly, securely.

The modular architecture required much more preplanning than a single monolithic approach. Each module had to be built and assembled with minimal risk of failure. Furthermore, each module had to be maintainable and modifiable without risking the integrity of the entire application. The concepts I came to understand were the API, or application programming interface, and data flow architecture. An API is a contract for application development that determines how the pieces are to be assembled together. A data flow architecture is the design of how information traverses the application to return a particular desired output. Knowing the data flow allows a proper design and implementation that reduces the risk of rogue input compromising the security and integrity of the application.

The modules themselves were simplified to three basic parts: the input, the function, and the output. A module would accept input values from outside, perform the desired function, and return the desired output. The advantage of this technique is that as long as the API itself is not changed, any modification to the function will not affect any of the code outside the module. This allowed me to modify applications quickly and, with fewer variables to consider, with a reduced risk of failure and simpler debugging. In addition to reducing the maintenance burden, I could now reuse the modules I had written before. Since I could accurately predict the output for a given input, I could quickly create new applications and worry less about software errors. This new understanding of basic information system design and implementation bestowed on me the ability to create ever more complex information systems from ever more complex software components.
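
The discipline is easy to show in miniature. A hedged sketch in Python (the function names and the stand-in data store are hypothetical): each tier hides behind a small, fixed interface, so any tier can be rewritten, say, the dictionary replaced by a real database, without touching the others.

    # Data tier: owns the data store; callers never touch the storage directly.
    def lookup_entries(keyword):
        store = {"linux": ["Slackware notes", "kernel tips"]}  # stand-in store
        return store.get(keyword.lower(), [])

    # Logic tier: pure input -> function -> output; no I/O, easy to test, reusable.
    def rank_entries(entries):
        return sorted(entries)           # callers rely only on getting a list back

    # Presentation tier: formats the output; knows nothing about storage.
    def render_html(entries):
        items = "".join(f"<li>{e}</li>" for e in entries)
        return f"<ul>{items}</ul>"

    # As long as each function's interface stays fixed, modifying one tier
    # cannot break code in the others.
    print(render_html(rank_entries(lookup_entries("linux"))))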


[Previous] [Index] [Next]

Copyright © 1996, 2002 Fanying Jen. All Rights Reserved.