GNU/Linux Desktop Survival Guide
by Graham Williams

ANet Install Log



The initial installation was on a test machine with very restricted network access. The purpose was to test, configure, and document the installation.

Standard install (see Section 4.1.3). Boot from DVD. Choose guided full repartition of the hard disk.

Install: lang=English, location=Australia, kb=American English, network=eth0 (also available were eth2, ..., eth4), hostname=anet01, partition=Guided, automatic, entire disk.

The partition layout chosen automatically was:

  /      279M  sda1
  /usr   5.0G  sda5
  /var   3.0G  sda6
  /home  119G  sda9
  /tmp   403M  sda8
  swap    18G  sda7

TZ=Sydney.

Set the root password, create a user account, and install from DVD with a tasksel selection of Desktop Environment, Web Server, File Server, SQL Database, and Standard System. The SMB install noted that WINS settings can be obtained from DHCP, so that was chosen (it then recommended installing dhcp3-client for this, but that was not done).

Reboot, and Gnome (GDM) started with no problems.

Continue installing from DVD to install wajig, configure sudo, and all the rest!

Installed Sun's jdk 1.6.0:

# mkdir /usr/local/sun
# cd /usr/local/sun
# sh /home/share/java-6u1-linux-amd64.bin
  Agree to the license if you do - but beware it contains limitations.
# update-alternatives --install /usr/bin/javac javac \
  /usr/local/sun/jdk1.6.0/bin/javac 120
# update-alternatives --install /usr/bin/java java \
  /usr/local/sun/jdk1.6.0/bin/java 120

We should then really do the same for all of the other binaries in /usr/local/sun/jdk1.6.0/bin, but a quick shortcut is simply to symlink them all into /usr/local/bin:

# cd /usr/local/bin
# ln -s /usr/local/sun/jdk1.6.0/bin/* .
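If the update-alternatives route is preferred for every tool, a loop along the following lines could replace the symlink shortcut. This is a sketch of my own, not from the original log; it assumes the JDK path used above and must be run as root, and the guard simply skips everything if the JDK is not present:

```shell
# Register every JDK tool as an alternative (sketch; run as root).
JDK=/usr/local/sun/jdk1.6.0
for tool in "$JDK"/bin/*; do
    [ -x "$tool" ] || continue   # nothing to do if the glob found no files
    name=$(basename "$tool")
    update-alternatives --install "/usr/bin/$name" "$name" "$tool" 120
done
```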

The bteq application is used to connect to a Teradata data warehouse. Installing it confirms that a connection to the data warehouse can be established, and hence that SAS/ACCESS Teradata will also be able to connect.

Teradata does not support Debian, but the driver works. Install the libraries provided for the i386 architecture:



rpm2cpio tdicu-01.01.02.00-1.i386.rpm | cpio -idv
rpm2cpio TeraGSS_redhatlinux-i386-06.02.00.00-1.i386.rpm | cpio -idv
rpm2cpio piom-02.04.00.00-1.i386.rpm | cpio -idv
rpm2cpio cliv2-04.08.02.00-1.i386.rpm | cpio -idv
rpm2cpio bteq-08.02.04.00-1.i386.rpm | cpio -idv

sudo cp -R opt/teradata /opt
sudo install usr/bin/bteq  /usr/bin/

sudo install usr/lib/* /usr/lib32

sudo ln -s /usr/lib32/errmsg.cat /usr/lib/errmsg.cat
sudo ln -s /usr/lib32/clispb.dat /usr/lib/clispb.dat
sudo ln -s /opt/teradata/teragss/redhatlinux-i386/06.02.00.00 \
           /opt/teradata/teragss/redhatlinux-i386/client

rm -rf ./opt ./usr
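Before starting bteq it can be worth checking that the loader resolves all of its 32-bit libraries. This sanity check is my addition, not part of the original log; it does nothing if bteq is not on the PATH:

```shell
# List any shared libraries that bteq still cannot resolve.
if command -v bteq >/dev/null; then
    ldd "$(command -v bteq)" | grep 'not found' || echo "all libraries resolved"
fi
```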

Then simply start bteq:

$ bteq

.LOGON hostname/user
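bteq also runs non-interactively, which is handy for scripting the connection test. The following batch sketch is my addition; hostname, user, and password are placeholders, and the guard skips the run when bteq is not installed:

```shell
# Run the logon and a trivial query as a bteq batch job (placeholders).
if command -v bteq >/dev/null; then
bteq <<'EOF'
.LOGON hostname/user,password
SELECT DATE;
.LOGOFF
.EXIT
EOF
fi
```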

A test driver was supplied by Teradata for RedHat. The package was installed under Debian using alien (via wajig):

$ wajig rpminstall tdodbc-03.06.00.00-1.x86_64.rpm

It complains that scripts won't be generated unless the --scripts option is used, but when that option is used we get some script errors that have not yet been explored. The library seems to be in the right place, but it has not been tested yet.

Sample configuration files appear in /usr/odbc/unixodbc.ini and /usr/odbc/odbc.ini.

A Debian package can be created with:

$ alien -d --scripts tdodbc-03.06.00.00-1.x86_64.rpm

to create tdodbc-03.06.00.00-2-amd64.deb. Installing this, though, repeatedly complains that scm:socal is an invalid user in a chown. But we can look at the scripts to see what it is trying to do.
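One way to look at those scripts (a sketch of mine, not from the original log) is to extract the maintainer scripts from the generated package and search them for the offending chown; the guard skips the step when the .deb is not present:

```shell
# Extract the control scripts from the alien-built package and
# search them for the chown that fails.
deb=tdodbc-03.06.00.00-2-amd64.deb
if [ -f "$deb" ]; then
    dpkg-deb -e "$deb" tdodbc-control
    grep -n 'chown' tdodbc-control/*
fi
```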

Testing will involve creating one's own ~/.odbc.ini and placing the contents of the sample odbcinst.ini into /etc/odbcinst.ini (is it required in that location?). But tdata.so complains that it cannot find libodbcinst.so, which is present in /usr/lib64/; perhaps this is an LD_LIBRARY_PATH problem when running from R?
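For reference, a minimal DSN entry in one's own odbc.ini might look like the following. The DSN name, host, and driver path here are my assumptions and should be checked against the samples in /usr/odbc:

```ini
; Hypothetical Teradata DSN; check paths against the /usr/odbc samples.
[testdsn]
Driver=/usr/odbc/drivers/tdata.so
Description=Teradata test DSN
DBCName=hostname
```

Exporting LD_LIBRARY_PATH=/usr/lib64 before starting R may also be worth trying for the libodbcinst.so lookup.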

Copyright © 1995-2018 Togaware Pty Ltd
Support further development through the purchase of the PDF version of the book.
Brought to you by Togaware and the author of the open source software Rattle and wajig.
Also the author of Data Mining with Rattle and Essentials of Data Science.