Determining which CPUs have been applied

The question of how to determine which CPUs have been applied to a database came up at a couple of presentations at the UKOUG, including mine.

A quick and dirty way (which I didn’t think of at the time) would be to look in the CPU directory:

c:> dir t:\oracle\ora92\cpu

11/06/2007  18:12     .
11/06/2007  18:12     ..
02/05/2006  17:30     CPUApr2006
30/04/2007  19:21     CPUApr2007
28/03/2006  18:51     CPUJan2006
......

Then to check which databases the catcpu script was run against, you can look in the directories themselves - the spool files have the name of the database in the file name:


c:> dir t:\oracle\ora92\cpu\CPUApr2007

30/04/2007  19:21            .
30/04/2007  19:21            ..
30/04/2007  18:19    4,918   APPLY_SID1_30Apr2007_19_19_49.log
30/04/2007  18:26    4,918   APPLY_SID2_30Apr2007_19_20_11.log 
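
On 10g (10.2 and later) the CPU apply also gets recorded in the data dictionary, as far as I can tell, so - assuming you’re on a version that has the view (9.2 doesn’t) - you can check from inside the database as well:

select action_time, action, version, id, comments
from   dba_registry_history
order  by action_time;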

Problem: TNS-03505: Failed to resolve name

We’re in the process of migrating from Oracle Names to LDAP, and found the following fairly trivial gotcha.

If you have a space between the word LDAP and the comma in the DIRECTORY_PATH, as below, then it won’t work:

NAMES.DIRECTORY_PATH= ( LDAP , ONAMES, TNSNAMES) 
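
With the space removed it resolves fine:

NAMES.DIRECTORY_PATH= (LDAP, ONAMES, TNSNAMES)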

Oracle options - update from Niall

Subsequent to the previous post I got a note from Niall, saying that

at least in 10g you can query dba_feature_usage_statistics as follows

SELECT NAME,VERSION,DETECTED_USAGES FROM DBA_FEATURE_USAGE_STATISTICS WHERE DETECTED_USAGES > 0;

Thanks, Niall!

Most used tiddlywiki commands for MainMenu

I put this into my MainMenu tiddler to give me a quick reference for my most used tiddlywiki formatting commands.

!Cheat sheet
|''Bold''|{{{''text''}}}|
|__Uline__|{{{__text__}}}|
|//Italic//|{{{//text//}}}|
|Bullets|{{{*text}}}|
|No.s|{{{#text}}}|
|Heads|{{{!text}}}|
|Table|{{{|t|t|}}}|
|Quote|{{{>>>}}}|
|{{{Mono}}}|{{{{{{text}}}}}}|
|[[Tid]]|{{{[[Text]]}}}|
|[[Help|http://www.blogjones.com/TiddlyWikiTutorial.html#EasyToEdit]]|{{{[[t|url]]}}}|

DOS equivalent to unix find -name -print

I’m planning to type up a couple of DOS and Windows equivalents of some of my most used unix commands.

These are very much limited, partial equivalents. In this particular case, I’m not replicating all the functionality of the unix ‘find’, but 9 times out of 10 when I use find, I use it like this:


find . -name 'sqlnet.ora' -print

You could of course use the Windows Explorer search. However, this doesn’t help if you want to, say, save the results into a text file or process them in a batch script. And besides, I prefer the command line!

Anyhow, my quick and dirty DOS equivalent is:


dir /B /S sqlnet.ora


Some explanation, if it’s needed:

  • The /S says do a recursive listing
  • The /B asks for a ‘bare’ listing with no size or date info. In this context, though, its real use is that it prefixes the full directory path onto each file in the output
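
As mentioned above, the point of having this on the command line is that you can capture or post-process the results. A couple of quick examples (the file name results.txt is just for illustration); note that in a batch file the for variable is written %%I, whereas at an interactive prompt it would be %I:

rem save the full paths to a text file
dir /B /S sqlnet.ora > results.txt

rem or process each match in a batch script
for /f "delims=" %%I in ('dir /B /S sqlnet.ora') do echo Found: %%I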

Bookmarking Oracle Metalink in IE or Firefox

This may well be obvious, but….

If you try to save the link to a Metalink note whilst you are actually in the note, it doesn’t always work very well.

Typically you get:

  • Name: ShowDoc

  • URL: https://www.metalink.oracle.com/metalink/plsql/f?p=130:14:420999257437239999::::p14_database_id,p14_docid,p14_show_header,p14_show_help,p14_black_frame,p14_font:NOT,1066139.6,1,1,1,helvetica,yada,yada,yada

There are two problems with this:

  • the Name’s not very useful, and more significantly,
  • the URL won’t work

The URL doesn’t work because of the string of numbers prior to the 4 consecutive colons. My guess is that this represents the login credentials of your current session, and these won’t work once you terminate that session or after you’ve rebooted your PC.

I currently use a couple of workarounds for this. Both involve right-clicking on a link to the page you’re currently on, so one way or another you have to navigate away from that page. It’s a bit manky, but it does seem to work.

Either:

  1. Go to the top of the note
  2. Click on the ‘Bookmark’ link (at the top right).
  3. This takes you into your own personal list of Metalink bookmarks.
  4. Right-click on the bookmark you’ve just saved.
  5. Select ‘Bookmark this link’ (Firefox) or ‘Add to Favourites’ (IE).

Sporadically I then delete all of the bookmarks on my Metalink bookmark page.

Or:

  1. Use the back arrow to go back to the page you came from.
  2. Right-click on the link you followed to get to the note.
  3. Select ‘Bookmark this link’ (Firefox) or ‘Add to Favourites’ (IE).

You then get the title of the note in the Bookmark, and a working URL. You need to already be logged into Metalink for it to work, though.

I think I had this working with del.icio.us as well, but since there was some sort of change to the Metalink login, the URL of the page doesn’t persist through the login screen.

DOS script to output contents of latest file matching a given spec

Some of our batch jobs write log information to standard output. Others create dated log files.

Our scheduling tool kind of works best if log information goes to standard output - it captures it and then lets you inspect it from the console.

I wrote the following script to spool out the latest version of a given dated log file.

I called it show_log.bat. So if you run it as follows:


cmd /k show_log d:\logfiles\my_batch_job_*.log

…it displays the latest my_batch_job_*.log that it can find, where * is any string - in our case typically a date of some sort. So I put a call to show_log at the end of each batch job which creates a dated log file and I can see it from the scheduler console too.

If you have a directory whose name happens to match my_batch_job_*.log it wouldn’t work so well - I had to use /S to get dir to return the full pathname for the file.


@echo off

rem This routine accepts a file specification (in the
rem format you use for dir) as a parameter, then
rem displays the last updated file matching that
rem specification to stdout
rem
rem It replaces log2stdout.pl

set DIR_STRING="%1"
echo Dir String %DIR_STRING%

rem
rem These couple of lines work out what the last
rem updated file is. The dir command orders by
rem the updated date. The /S means do a recursive
rem search. This is there to ensure that the dir
rem returns the full path name rather than to
rem actually search a directory structure.
rem
rem We would get unexpected results if the
rem dir_string does actually match
rem a directory.

set LAST_FILE=
for /f %%I in ('dir /B /N /S /OD %DIR_STRING% ') do set LAST_FILE=%%I
echo Last file is %LAST_FILE%

if "%LAST_FILE%"=="" exit

rem
rem Retrieving the date stamp of the file for display

for /f "tokens=1,2" %%I in ('dir %LAST_FILE% ^| findstr /V "Volume Directory ( ^$"')  do set DATE_CREATED=%%I %%J
echo Date created is %DATE_CREATED%

echo Going to output this file:
dir %LAST_FILE% | findstr /V "Volume Directory ( ^$"
echo -----

rem
rem Echoing line by line rather than simply using
rem 'type' because I wanted to prefix each line
rem with the file's datestamp. The reason for this is
rem because I don't want there to be any confusion
rem as to what file is being looked at, either
rem because this script has failed or because
rem the file you're expecting hasn't actually
rem been created

for /F  "delims=" %%I in (%LAST_FILE%) do echo %DATE_CREATED% Log: %%I

exit


Moving to netbackup - it’s case-sensitive....

We’re currently migrating our backups from TSM to netbackup.

I know TSM quite well, but I’m learning netbackup.

The first lesson was that it’s case-sensitive on Windows:

bplist  -R d:\backup
   EXIT STATUS 227: no entity was found

fixed by

bplist  -R D:\backup
   D:\backup
   D:\backup\MTP1
   D:\backup\MTP1\DB_MTP1_5_1_605982503_05I1T3P7_1_1
   D:\backup 

Missed email alerts because 'Sent' date missing

I scripted our backup jobs to send an email to me if there’s a problem. This works well enough, usually.

However this week the backups started failing because of a dodgy disk, but it seemed I wasn’t getting the alerts. It wasn’t until one of the sysadmins mentioned the problem with the disk that I found out there was a problem. I made a mental note to check out why I hadn’t been alerted but, so far, hadn’t got any further than that.

As it turns out the alerts weren’t failing, but were being created with no ‘Sent’ date, and for some reason I must have chosen to display the ‘Sent’ time rather than the ‘Received’ time in Outlook, and to sort by it.

So, the email alerts were coming in to the bottom of my Inbox where in the normal course of events I’d never see them. Doh!

I need to implement some sort of positive check on the success of the backups - in previous jobs I’ve set up a directory listing of the directory containing the backup files and incorporated that into a daily check. I’ll probably do the same again.
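
A minimal sketch of the sort of check I mean, assuming the backup files land somewhere like D:\backup (the paths here are just placeholders):

rem capture a date-ordered listing of the backup directory so that a
rem missing or stale backup file stands out in the daily check
dir /OD D:\backup > D:\checks\backup_listing.txt
type D:\checks\backup_listing.txt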

Changing the DOS prompt

I’m mucking about with VSS from the command line.

To do this you need to be in the VSS directory, which looks like this, for me:

     C:\Program Files\Microsoft Visual Studio\VSS\win32>

This doesn’t leave much room for typing before the line starts wrapping.

This command:

    prompt $d$g

Sets the prompt to:

    31/08/2006>

Which is a bit better….
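
And to get back to the usual path-style prompt afterwards ($p$g is, as far as I remember, the default):

    prompt $p$g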

How to create a spool file in the format <specified prefix>_databasename.lst

This is relatively handy, particularly when running the same thing against multiple databases. I’ve saved this in a script called spool.sql, so I can type @spool to start spooling to a file which has the database name in the middle.

undef prefix
column name new_value dbname
select name from v$database;
spool &&prefix._&dbname
prompt Spooling:
host dir /b &&prefix._&dbname..lst 

How to use sqlplus substitution variables

I didn’t know you could do this. It’s quite handy!

SQL> column name new_value dbname
SQL> select name from v$database;
XXX1
SQL> spool test_&dbname
SQL> select sysdate from dual;
03-JUL-06
SQL> spool off
SQL> host dir /od *.lst
03/07/2006  13:16                61 test_XXX1.LST
SQL> 

Query to compare parameter settings between databases

Create a link to the remote database, then supply the name of the link as &1.

set pagesize 1000
col name format a28
col local format a20
col remote format a20
select local.name, local.value local,  
                   remote.value remote, 
       substr(local.isdefault, 1,1) || '/' ||
       substr(remote.isdefault,1,1)
from v$parameter local,
     v$parameter@&1 remote
where local.name=remote.name
and   lower(local.value) <> lower(remote.value)
order by local.name
/ 
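
For completeness, a hypothetical example of creating the link first - the link name, account and TNS alias are just placeholders:

create database link remote_db
  connect to some_user identified by some_password
  using 'REMOTE_TNS_ALIAS';

Then pass remote_db in as &1 when you run the script.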

SQL Server 2005 New Features

Here is some stuff I learnt at yesterday evening’s SQL Server 2005 New Features session at Microsoft in Reading.

Enterprise Manager

  • Enterprise Manager has been re-hashed, and re-named Management Studio.
  • It didn’t seem radically different in terms of what it does.
  • It will work with previous versions
  • Unlike Oracle’s offering, it isn’t browser based

Osql - Now called sqlcmd

  • It has definitely been much improved.
  • There are a lot of command line switches.
  • You can now pass in parameters from the command line, like Oracle (see the sketch after this list)
  • I’d be interested to see whether it would work with SQL 2000. And Sybase
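
A quick sketch of how the parameter passing looks, as far as I can make out - the server name, script name and variable are made up for illustration:

rem pass a value in on the command line...
sqlcmd -S MYSERVER -i row_count.sql -v TableName="sysobjects"

rem ...and inside row_count.sql refer to it as $(TableName), e.g.
rem SELECT COUNT(*) FROM $(TableName)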

Security

  • The bloke said that S2K had been ‘re-architected’ - I couldn’t really see this at all
  • The concept of schemas (or schema owners) has been separated out from that of users. I think in practice this just means you have a ‘special’ type of user for owning the objects.
  • There is supposed to be better levels of encryption.
  • SAC - see below

Surface Area Configuration

  • Embodies concept of trying to reduce the ‘surface area’ of the database i.e. the amount of stuff you’ve got installed
  • There’s a specific GUI to control this, and (interestingly) a very full-featured command line option

Other

  • There’s a database mail tool. I couldn’t see much advantage to this other than not having to install the Outlook client
  • There’s a concept of database snapshots. You ‘take’ a snapshot of all the data, then can optionally go back to it in an Oracle-flashback way (see the sketch after this list)
  • Due to changes in the licensing laws, they no longer give out free beer
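
A rough sketch of what a snapshot looks like in T-SQL - the database and file names are invented, and NAME has to match the logical data file name of the source database:

-- take the snapshot
CREATE DATABASE Sales_snap
ON (NAME = Sales_data, FILENAME = 'D:\snapshots\Sales_data.ss')
AS SNAPSHOT OF Sales;

-- and go back to it, Oracle-flashback style
RESTORE DATABASE Sales FROM DATABASE_SNAPSHOT = 'Sales_snap';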
