The Secret Internet Protocol Router Network: the “Secret” Internet

Among the many things WikiLeaks has outed is the existence of the SIPRNet, or Secret Internet Protocol Router Network, used by the U.S. Federal Government to exchange confidential information. Unfortunately, “confidential” came to mean about half a million users worldwide, among them the alleged WikiLeaks leaker Bradley Manning, who famously (and, yes, allegedly) dumped thousands of documents in Julian Assange’s lap.

With the term “router” in the name, this network sounds like a parallel Internet, not a VPN-like operation. That means dedicated lines, private communications and an interesting DNS manipulation that layers additional top-level domains onto the existing ones. One overview reveals a few facts about users and exposure at

Popular Mechanics has a good “consumer-level” article, light on the network protocols and a little heavier on the non-confidential details, at

Also see the Wikipedia article on this former secret at

While I find it unsurprising that the feds would operate a secret network like this, I am impressed at just how un-secret it is. One could practically guarantee that this isn’t the whole story.

The Roots of Computing: the Sinclair ZX81

Okay, I’m the first to admit I’ve been working with computers for a long time, a really long time by today’s standards. My first experience involved punch cards and a DEC PDP-8/E, COBOL and FORTRAN, in 1975. The fussiness of the technology left me cold, but since those classes I’ve always had a sort of hot-rodder’s obsession with computing hardware. Should I buy a dedicated word processor, or a full-on computer? Could I build my own? Kits were common back then, so that wasn’t a ridiculous question. But the minute I got my hands on my first 8086 (thank you forever, Oscar Boyajian!) a world opened up that has been growing at astonishing speed ever since.

People who were already addicted to coding in the early 1980s gravitated to several emerging home-computer platforms like Atari’s. The Sinclair units had gone through several iterations, but Sinclair hit the first sweet spot with the ZX81. It became a consuming (and annoying) habit for teenagers who would later be tagged as geeks, and later still as bosses.

Now comes this great article from the BBC.  Check it out here:

Using Backtrack: Network Mapping: Identify Live Hosts: NBTScan



Given an IP address range or subnet, nbtscan returns NetBIOS names mapped to the responding IP addresses. Verbose output (-v) returns the entire cached NetBIOS name table from each responding Windows machine, which is a great way to map deeper into a network.

Note that this is a Windows-only scanner. That doesn’t mean it runs only on Windows (it runs there as well as on *nixes); rather, it maps only Windows NetBIOS names, not IP hostnames.
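A minimal invocation might look like this (192.168.1.0/24 is a stand-in for your own target range; nbtscan must be on the PATH, as it is in BackTrack):

```shell
# Scan a subnet for NetBIOS names and their responding IP addresses
nbtscan 192.168.1.0/24

# Verbose: dump each responder's entire cached NetBIOS name table
nbtscan -v 192.168.1.0/24
```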

Home Page and Tutorial:

eicar Test Files

Here’s something to think about: how often do you test your anti-virus software to ensure that it’s working properly? Using the European Institute for Computer Antivirus Research (eicar) test files, you can test the response of your anti-virus and anti-malware software. I currently have a ticket open with eicar to see how often they update the test files. I will post that information when I receive it. In the meantime you can check them out at the link below and try their test files.
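The eicar “virus” is nothing but a specific 68-byte ASCII string that AV vendors have agreed to detect. As a quick sketch, you can build the test file yourself (the filename eicar.com is just the conventional choice); a working on-access scanner should flag it the moment it lands on disk:

```shell
# Write the standard 68-byte EICAR test string, byte for byte, to a file
printf '%s' 'X5O!P%@AP[4\PZX54(P^)7CC)7}$EICAR-STANDARD-ANTIVIRUS-TEST-FILE!$H+H*' > eicar.com

wc -c eicar.com   # the file should be exactly 68 bytes
```

If your scanner stays silent, that file should be the first thing you mention in your support ticket.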


Here is the response I received:


there is no update needed, because the EICAR antivirus test file becomes
an industry standard. Please read the information on our website.

Best Regards,
Marc Schneider


Stuxnet: A Declaration of Cyber War

One of the most insistent voices in cyber security (I invite you to tell me just who) has been very clear that he (that’s a clue) thinks the metaphor is wrong, and potentially disastrously wrong, when we talk about “cyber warfare.” War, he insists, is not the same as what crackers do.

Except that now, maybe it is.

Like most security people, I’ve been watching the Stuxnet story closely since word emerged into the open about this strange virus that targets industrial control systems.

Bruce Schneier took note on his blog at

The German Langner group blogged extensively, for instance at

And it became very clear this was as different from a “normal” piece of malware as the sun is from a candle. Now faithful reader Herbbie R. links me to a great Vanity Fair article at

READ THIS ONE! Then I’d suggest a stiff drink.

Security? What security?

It’s been a hellish couple of weeks, because I took over a night class for a colleague who had a death in the family. Working 9-9 for any number of days will make one week/weak. Glad to be past that.

But what do my many active contributors feed me during this time? Good lord.

First, faithful follower Herbbie R. sends me this link about the private BSD/Linux distributors’ security email channel. Apparently this channel is no more, useful as it may have been in keeping the makers and distributors at least caught up with the curve, because somebody broke in and monitored those emails. You’da sorta thunk those BSD/Linux gurus would have their security down tight! But I’ve been hacked myself, so I won’t dig too hard. Read about it here:

Speaking of insider emails, I received an email that sorta-kinda looked like the ones I get from an organization to which I belong. It urged me to click all kinds of links plainly labeled with their destinations (like my link just above). The difference is that if you hover over my link, you’ll see its destination is exactly what I say it is. This particular email’s links revealed very different destinations than they claimed. I fired off a warning to the organization, but later found the email was in fact valid. Safe? Trustworthy? No, but valid; the organization uses a third-party polling service for some of its activities, and that service was responsible for the misleading links.

You will be safe in imagining my response to learning this. Let me just say this: It is a serious violation of the Trust (I capitalize on purpose) I place in you, for you ever to send me misleading links. This is not a forgivable offense; it is a practice about which I most emphatically warn my clients. Misleading links are, in the vast majority of cases, pure evil. Asking me to click your misleading link just makes me really mistrust you. Is this really what you want?

And finally, the onset of “Cloud Computing,” which makes me think of the weird Windows commercial about the couple trapped at the airport, who Remote Desktop into their home PC to watch recorded TV. As the woman says, in a strange, faint, strangled voice, “Yay cloud.” No, that’s not the cloud. That’s just scary: they’re still trapped in the airport!

Those of you who know what the cloud is, and are taking advantage of it, will be glad to read the article Cloud Computing Elevates the Role of IT at

Read it and see the next great area of IT job demand. I’ll see you there.

Using Backtrack: Network Mapping: Identify Live Hosts: fping



Fping does a “fast ping” of a list of hosts. It’s set up as a scripting-friendly tool, with output that’s easy to parse. Supply a list of target IPs at the command line, or use an input file. Then fping will (very quickly) ping each IP in series without waiting for a response. If a host responds, it’s up and ready to exploit.
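A typical sweep might look like this (192.168.1.0/24 is again a stand-in range; -a prints only hosts that are alive, -g generates the target list from a range, -f reads targets from a file):

```shell
# Sweep a subnet and keep only the hosts that answered
fping -a -g 192.168.1.0/24 2> /dev/null > alive_hosts.txt

# Or read targets from a file, one address per line
fping -a -f targets.txt
```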

Take note of the very nice Perl script example on the man page.


Network Mapping: Identifying Live Hosts

Home Page:

Man Page:

Regular Expressions and Special Variables

Special Variables

$_ is the “default input and pattern matching” variable; the default input is often the current line of a file

@_ is the list of incoming parameters to a subroutine

See Well House Consultants

$. is the current input line number

$$ is the current process ID

$^O is the operating system (that’s an “Oh”)

$#_ is the index number of the last parameter

A basic pattern-matching loop

while ($my_var = <MY_FILE_HANDLE>) {

    if ($my_var =~ /search_pattern/) {

        # Notice that =~
        # It's the "search/match" (binding) operator;
        # the / / characters delimit the pattern itself

        print MY_FILE_HANDLE $my_var;
        # We just printed the line to the file

        # Alternately, just dump the line to STDOUT:
        print $my_var;
    }
}


If you run this loop but don’t name a loop variable, $_ is already waiting for you:

while (<MY_FILE_HANDLE>) {

    /search_pattern/ and print MY_FILE_HANDLE $_;

    print;   # with no argument, print falls back to $_
}

@_ # The array of incoming parameters supplied to a subroutine.

@_      # The whole array

$_[0]   # The first element of the array

$_[1]   # The second element

$#_     # This one's odd: it's the index of the last element (which is not quite the same as the count, because this is a zero-based array).

sub call_me {
    print "Element zero is " . $_[0] . "\n";
    print "There were ", $#_ + 1, " parameters.\n";
}

use English;

This is a pragma.

Allows addressing @_ as @ARG

Allows addressing $$ as $PID


This is why we learned about =~.

$my_string = "I'm hard at work.\n";

if ($my_string =~ /work/) {
    print "He's working.\n";
}



\d   # matches any single digit

\w   # matches any letter, digit or the underscore

\s   # matches any space (white space): space, tab, \n, \r

Capitalize any of the above to invert its meaning (\D, \W, \S).

^    # Beginning of line or string

$    # End of line or string

.    # Generic wildcard character: matches any ONE character
     # so /x.z/ matches x1z, xSz, x-z, etc.

*    # Preceding character match: matches the preceding character ZERO OR MORE TIMES

One use of * is with the dot character, when any number of any characters could appear at that position: /this.*/ matches "this" followed by anything.

+    # Preceding character match: matches the preceding character ONE OR MORE TIMES

?    # Preceding character match: matches the preceding character ZERO OR ONE TIME

Create groups of optional characters with parentheses:

/Dec(ember)?/   # matches either "Dec" or "December"

Combining expressions

/^\d+\s\w+$/   # a line of digits, one space, then one word

Character Classes

[qwerty]    # matches any of q, w, e, r, t or y

[^qwerty]   # DOESN'T match any of q, w, e, r, t or y
            # Be darn careful where that ^ is.



/pattern/i            # i makes the match case-insensitive

s/search/replace/g    # search; replace; global
                      # With m//g, g also tells Perl to return to its last
                      # position in the string on the next iteration



if ($_ =~ /(heck|darn|dang|fooey)/) {
    print "This mild cussing is present in this line: $1.\n";
}

The parentheses form a capturing group, and the $1 variable holds the string that produced the match. If the match was "heck" then $1 = "heck". If you have two subexpressions, you'll have $1 and $2, and so forth:

$singer = "Wendy Wall";
$singer =~ /(\w+) (\w+)/;
# $1 holds "Wendy" and $2 holds "Wall"


Search and Replace

$sentence = "This is the usual cat and dog example. It mentions two cats.";
$sentence =~ s/cat/dog/g;
print $sentence;



The formal Perldoc –

Perl Matching With Regular Expressions – a long page with very good detail –

– with many good examples –

– with more, and detailed, examples –

Regular Expression Reference – useful, concise and highly recommended –


A line from an Apache log file looks like this (the client hostname at the front is elided here):

- - [01/Nov/2000:00:00:19 -0400] "GET /news/home/index.htm HTTP/1.1" 200 2285

So let’s hack on this analyzer.


# We have to supply the log name as the first command argument
$logfile = $ARGV[0];

unless ($logfile) { die "Usage: $0 <httpd log file>\n"; }

analyze($logfile);
report();

sub analyze {
    my ($logfile) = @_;

    open (LOG, $logfile) or die "Could not open log $logfile - $!";

    while ($line = <LOG>) {
        @fields = split(/\s/, $line);

        # Make /about/ and /about/index.html the same URL.
        $fields[6] =~ s{/$}{/index.html};

        # Log successful requests by file type. URLs without an extension
        # are assumed to be text files.
        if ($fields[8] eq '200') {
            if ($fields[6] =~ /\.([a-z]+)$/i) {
                $type_requests{lc $1}++;
            } else {
                $type_requests{txt}++;
            }
        }

        # Log the hour of this request
        $fields[3] =~ /:(\d{2}):/;
        $hour_requests{$1}++;

        # Log the URL requested
        $url_requests{$fields[6]}++;

        # Log the status code
        $status_requests{$fields[8]}++;

        # Log bytes, but only for results where byte count is non-zero
        if ($fields[9] ne "-") {
            $bytes += $fields[9];
        }
    }
    close LOG;
}

sub report {
    print "Total bytes requested: ", $bytes, "\n";
    print "\n";

    report_section("URL requests:", %url_requests);
    report_section("Status code results:", %status_requests);
    report_section("Requests by hour:", %hour_requests);
    report_section("Requests by file type:", %type_requests);
}

sub report_section {
    my ($header, %type) = @_;

    print $header, "\n";
    for $i (sort keys %type) {
        print $i, ": ", $type{$i}, "\n";
    }
    print "\n";
}