Treehouse Software Customer Case Study:
Tallahassee Community College

The FCCSC created "Integrow", a suite of automated, Web-enabled administrative software systems built with a variety of Software AG tools, including ADABAS, NATURAL, and EntireX. Integrow was designed to support community colleges and enabled the creation of a real-time, Web-based class registration and information component. Integrow also facilitates communication among eight IBM mainframes located at various schools. TSI products play a key role in maintaining the FCCSC's commitment to this system.

The following is a recent discussion between Ron Baumgardner, Mainframe/Web Technology Coordinator at TCC, Mike Robeck, TCC Computer Systems Analyst, and Daniel Sycalik, Senior Technical Representative for TSI.

Ron, please tell us a little about Tallahassee Community College.

Since 1966, TCC has offered post-secondary instruction of the highest quality for citizens of Leon, Gadsden, and Wakulla counties (the primary service area), along with students from throughout the state, nation, and world.

Around 80 percent of TCC's enrollment of over 14,500 students is in the Associate in Arts transfer program. TCC was recently listed 22nd among the nation's top producers of A.A. graduates. The largest feeder institution to Florida State University, TCC also has an excellent relationship with Florida A&M University and other universities in Florida. Nearly three-fourths of the College's A.A. graduates transfer into the State University System the next year, the highest percentage in the Florida Community College System.

Please describe the ADABAS/mainframe environment at TCC.

The TCC mainframe environment consists of an IBM 2003 model 2C5 with an 89 MIPS processor running both the VSE/ESA and VM operating systems. TCC runs the full SAG software product suite (ADABAS 7, NATURAL 3.1.4, Predict, Construct, etc.). Communication with the Web is handled through EntireX 5.3.2, patch level 14.

The web server environment is running under the Windows 2000 operating system and web pages are served through IIS 5.0 and/or JRun 4.0 web servers.

Our Web applications are written in JAVA and provide the following services: student registration, payment of fees via credit card and financial aid, grades/records, transcript requests, change of address and PIN, student e-mail account activation, class rosters with web pictures, student advising, and FACTS (Florida Academic Counseling and Tracking for Students). All applications are either consortium-based products, enhanced consortium products, or applications written specifically for TCC. As a member of the FCCSC, we work diligently to apply the current web standards adopted by the consortium, so that the work we do here benefits not only us but the other members of the consortium as well. In turn, we rely on the other schools to develop applications that we may wish to implement at TCC in the future.

With the FCCSC and Integrow ensuring the long-term commitment to SAG ADABAS/NATURAL applications for TCC, how do tRelational and DPS assist in this commitment?

The use of tRelational and DPS has increased the longevity of the SAG products at TCC, because they give us a mechanism for delivering data to many different venues far more effectively than we could in the past, without having to change the core product (ADABAS). In other words, the ease of transferring ADABAS data using TSI products makes it unnecessary to consider moving away from our much-trusted SAG products.

Please describe the recent project initiative involving tRelational and DPS.

The initial project that we started with was "Class Rosters for the Faculty" on the Web. The faculty wanted student pictures and pertinent data about the students in their classes to be easily available.

Also during this period, the presidency of TCC passed to Dr. Bill Law. It became clear early on that we needed a long-term data warehousing strategy, with the objective of deploying application data to Microsoft SQL Server to satisfy query and reporting requirements.

Please describe the application data targeted and the objectives for deployment to SQL Server.

There were two approaches we could take to get the data to the Web:

  • Use EntireX for real-time access. Although we could have retrieved all of the data directly from the mainframe, this was not practical from an I/O efficiency standpoint.

  • Replicate the data into an RDBMS environment. With response time and I/O efficiency in mind, we decided to rely on Microsoft SQL Server 2000 to store the data. During registration periods, the mainframe is in high demand, and by moving the data, we were able to take pressure off the mainframe and improve response time.

We evaluated two ways to replicate the data:

  1. Use mainframe extract programs and move all of the data twice a day.

  2. Use tRelational/DPS to update only the data that had changed. This approach was adopted because we knew time would become an important issue for the transfer of data to the SQL Server environment (a minimal sketch of this change-only approach follows below).
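To make the distinction concrete, here is a minimal, illustrative sketch in Java/JDBC of applying only changed records to SQL Server rather than reloading everything. The STUDENT table, its columns, the connection string, and the change actions shown are hypothetical stand-ins; they are not the actual DPS output format or the TCC schema.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    // Illustrative only: apply a single changed record to SQL Server instead of
    // reloading the full extract. Table, columns, and connection details are
    // hypothetical placeholders, not the actual TCC schema or DPS output format.
    public class ChangeOnlyApply {

        // action: "I" = insert, "U" = update, "D" = delete
        public static void applyChange(Connection con, String action,
                                       int studentId, String name) throws Exception {
            if ("D".equals(action)) {
                try (PreparedStatement del = con.prepareStatement(
                        "DELETE FROM STUDENT WHERE STUDENT_ID = ?")) {
                    del.setInt(1, studentId);
                    del.executeUpdate();
                }
            } else {
                // Try an update first; if no row exists yet, fall back to an insert.
                try (PreparedStatement upd = con.prepareStatement(
                        "UPDATE STUDENT SET NAME = ? WHERE STUDENT_ID = ?")) {
                    upd.setString(1, name);
                    upd.setInt(2, studentId);
                    if (upd.executeUpdate() == 0) {
                        try (PreparedStatement ins = con.prepareStatement(
                                "INSERT INTO STUDENT (STUDENT_ID, NAME) VALUES (?, ?)")) {
                            ins.setInt(1, studentId);
                            ins.setString(2, name);
                            ins.executeUpdate();
                        }
                    }
                }
            }
        }

        public static void main(String[] args) throws Exception {
            // Driver URL and credentials are placeholders.
            try (Connection con = DriverManager.getConnection(
                    "jdbc:sqlserver://dbhost;databaseName=TCC", "user", "password")) {
                applyChange(con, "U", 1001, "Jane Doe");
            }
        }
    }

The point of the comparison is simply that each propagation run touches only the rows that changed, whereas option 1 would move all of the data twice a day.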

How many ADABAS Files were modeled, and how many SQL Server tables resulted?

There were 22 physical ADABAS files (containing 7,000,754 records) that resulted in 85 SQL Server tables.

Approximately how many ADABAS records were extracted, and how many RDBMS rows resulted from DPS Materialization? Also, how long did it take to Materialize?

Materialization extracted 4,463,094 rows in five hours, and the data was then loaded into SQL Server in under one hour.

Can you please describe your automated DPS Propagation Implementation?

Actually, it's a five-step process:

  1. Force a FLIP of the ADABAS PLOG (protection log).

  2. User Exit 2 in ADABAS was modified to both hold the tape in the tape drive after successfully copying the PLOG dataset and release a DPS job from the VSE reader queue to create the DPS dataset used to update the SQL server.

  3. The file is then FTPed to a server.

  4. A JAVA program that runs continuously detects that a file has been transferred, then takes the file and applies the updates to the SQL server (a rough sketch of this step follows the list).

  5. The JAVA program then archives the dataset for future reference (if needed).
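As a rough illustration of steps 4 and 5 (a sketch under assumptions, not the actual TCC program), a long-running watcher of this kind might poll an incoming directory, hand each transferred dataset to the update logic, and then move it to an archive directory. The directory paths and the applyUpdates call below are hypothetical.

    import java.io.File;

    // Illustrative sketch of a long-running watcher (steps 4 and 5 above):
    // poll an incoming directory for the FTPed DPS dataset, apply its updates
    // to SQL Server, then archive the file for future reference.
    // Directory names and the update logic are placeholders.
    public class PropagationWatcher {

        public static void main(String[] args) throws Exception {
            File incoming = new File("C:\\dps\\incoming");
            File archive = new File("C:\\dps\\archive");

            while (true) {
                File[] files = incoming.listFiles();
                if (files != null) {
                    for (File f : files) {
                        applyUpdates(f);              // parse the dataset and update SQL Server
                        File dest = new File(archive, f.getName());
                        if (!f.renameTo(dest)) {      // keep the dataset in case it is needed later
                            System.err.println("Could not archive " + f.getName());
                        }
                    }
                }
                Thread.sleep(60000);                  // poll again in one minute
            }
        }

        private static void applyUpdates(File datasetFile) {
            // Placeholder: read the DPS output and apply the inserts, updates,
            // and deletes to the SQL Server tables (e.g., via JDBC).
        }
    }

A production version would also need to confirm that the FTP transfer is complete before processing a file, for example by transferring to a temporary name and renaming the file once the transfer finishes.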

How often do you perform DPS Propagation, and on average, how long does it take to execute the end-to-end process?

We propagate twice a day, and it takes approximately five minutes each time.

How long did the project take, and were there any significant obstacles?

The original project took about two months to complete. Since this was the first time we had used tRelational/DPS in a production environment, it took us about 3-4 weeks (coding/testing time) to set up files, create the Java programs on the server side, and automate the propagation process. However, this was a one-time investment. Future projects will take about 10-15 minutes per file to set up for extract (as opposed to several days per file to code/test extract programs).

How many end users are utilizing the SQL Server data?

There are over 12,000 students and 733 faculty/staff.

What features of tRelational and/or DPS factored significantly in the success of the project?

The ability to create an automated process that moves only changed data, as opposed to moving all of the data every time; the ability to add files to the migration without having to write additional extract programs; and the ease of administration that tRelational provides.

Please describe how tRelational and DPS will factor in the support of the future application "ports" provided by the FCCSC.

tRelational/DPS is being investigated as the tool of choice for the FCCSC to transfer data to alternate RDBMS platforms. This plan is in its infancy at the FCCSC and has great potential.

Can you speculate on whether you could have been successful without DPS Propagation, and how long it would have taken without tRelational and DPS?

We would have been successful without tRelational/DPS, but it would have been a lot harder on human and machine resources and would have taken longer to implement. Also, nightly processing would be several hours longer without the TSI products.

How do you envision continued use of tRelational and DPS?

tRelational/DPS will be considered for every new Web initiative that requires mainframe data. If it makes sense, we will use it. Other organizations in the FCCSC are planning to replace existing extract programs and implement data warehouses, and tRelational/DPS will play a key/lead role in supporting statistics and research initiatives.

 

 
