Microsoft ASP.NET OOB Patch

Sorry to interrupt the flow of the Making the Web Work for You series, but this is somewhat important. Microsoft issued a patch last week for the outstanding ASP.NET issue, which could pose a significant threat to those of you with Internet-facing IIS servers. Although all .NET systems are vulnerable, the affected IIS boxes pose the greatest risk of exploitation. For some reason the OOB (out-of-band) patch is only available through the Microsoft Download Center.

http://www.microsoft.com/technet/security/bulletin/MS10-070.mspx

Making the Web Work for You – Part 3

Site Freebies

Now that we have your site up and have taken some baby steps toward getting visitors to it, it’s time to manage and analyze the site. As luck would have it, there are plenty of tools for just these tasks, and better yet, they are free.

There is an entire toolbox at your disposal at http://www.google.com/webmasters/

The first of these useful items is Webmaster Tools, which can run site diagnostics and give you an indication of how the site is working, which search terms are bringing people to your site, and even whether malware has been found on it. Overall site performance (page load times) is also tracked and graphed.

The other tool is Google Analytics, which does require you to add a snippet of code to your pages, but the statistics gathered are very useful. Analytics gives you tremendous insight into your visitors: how they are getting to your site, how long they stay and what they view. With this data in hand you can hone your site, tailor it to suit your visitors’ needs and maximize its potential. You can also set “goals” – pages you want visited or actions you want performed (like filling out a request for quotation) – and tune your site to funnel visitors toward those goals more effectively.
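
To give you an idea of what that snippet involves, here is roughly what the asynchronous tracking code Google hands out looks like – UA-XXXXX-X is a placeholder for the account ID Analytics assigns to your site, and the whole thing goes just before the closing </head> tag of each page:

<script type="text/javascript">
  // Queue up the account ID and a pageview for the asynchronous tracker.
  // UA-XXXXX-X is a placeholder - use the ID from your own Analytics profile.
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXX-X']);
  _gaq.push(['_trackPageview']);

  (function() {
    // Load ga.js asynchronously so it does not block page rendering,
    // picking the SSL host when the page itself is served over https.
    var ga = document.createElement('script');
    ga.type = 'text/javascript';
    ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
</script>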

This is all very Google-centric, and there now appear to be similar tools for Bing – but it is probably a better idea to tackle the Google side first, given its dominance in the search market. If you feel ambitious, look into doing something similar with Bing and Yahoo!

This is not a set-and-forget kind of thing; you should expect to check back frequently to monitor how your site is doing. You can also have Analytics send you scheduled emails to save you from visiting the site. It is also not a bad idea to run the malware check regularly – it’s not always readily apparent to the site owner when you’ve been compromised. All in all, these free tools can help you or your webmaster really tune up your site and give you far more information than just page hits.

Making the Web Work for You – Part 2

Bringing in the Traffic

In part one of the series, I talked about getting a site and doing it right. Now what? Well, if you have a site and you are content with its form and function, it is likely a good time to find ways to get noticed and drive traffic to your site. While I often suggest using other media to announce your web presence – print, radio, etc. – today I’m sticking to the web and the ever-present search engine. One of the main goals for your site is a good page ranking – being listed in the top results when a potential client searches using keywords that relate to your business. This general process is known as Search Engine Optimization (SEO), which is equal parts science and art. There is also no lack of people claiming to be SEO experts who will gladly take your money and deliver questionable results – take due care.

First, your site needs a little prep work; there are three things to take care of right off the bat – your meta tags, sitemap and robots.txt. All of these might be something your web developer will take care of (and they should), but it does not hurt to know what they are all about.

Meta tags are pieces of HTML code – hidden from the casual observer – that hint to the search engines what your site is all about. These hints help your site get properly indexed, so when someone searches for “antique watch insurance southern Ohio” your site gets listed, preferably on the first page of results. The two key meta tags for this are called description and keywords – the first is a brief description of your site or that particular page; the latter is a collection of keywords that relate to the content, essentially the search words that should bring a potential client to your doorstep. Here is what they look like in the page source code: http://www.w3schools.com/tags/tag_meta.asp
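
To make that concrete, here is a minimal example using our hypothetical antique watch insurer – the content values are just illustrations, and these tags live inside the <head> of the page:

<head>
  <meta name="description" content="Specialty insurance for antique and collectible watches, serving southern Ohio." />
  <meta name="keywords" content="antique watch insurance, collectible watch coverage, southern Ohio" />
</head>
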
A sitemap is an XML file – sitemap.xml – that resides at the document root on the web server. It is a file that search engines can use to understand the layout and content of your site – a guide or map that crawlers refer to. Since many websites have dynamic content, a good sitemap helps reveal that content and the site’s true hierarchy, which would otherwise be hidden from the crawler and therefore not indexed. Sitemaps do not have to be handcrafted; there are free generators available to simplify the task – http://www.xml-sitemaps.com/
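
For the curious, a bare-bones sitemap.xml describing a single page might look like the following – example.com, the date and the change frequency are all placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2010-09-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
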
Finally, robots.txt. Now that you want to entice the various web crawlers, spiders and bots to come for a visit, you may also want them to play nice – to index what you want indexed and ignore other content. The robots.txt file gives them basic instruction on what parts of the site are okay to crawl and what is off limits. Keep in mind that not all crawlers are respectable, and they can ignore robots.txt – dealing with them is a whole other story though. Again, this file goes in your web document root directory. Using robots.txt is not a security measure; it does not ensure the privacy of content and should not be used with that goal in mind. More info – http://www.robotstxt.org/robotstxt.html
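
As a sketch, a robots.txt that welcomes every well-behaved crawler but steers them away from a couple of directories (the paths here are placeholders) is as simple as:

User-agent: *
Disallow: /admin/
Disallow: /scratch/

Sitemap: http://www.example.com/sitemap.xml

That last line is a bonus – you can point crawlers straight at your sitemap from within robots.txt.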

With those items taken care of, it’s time to submit your site to some major search engines. This is basically the process of alerting the various search engines to your site’s existence by submitting the URL to their various services:

http://siteexplorer.search.yahoo.com/submit
http://www.google.com/addurl/?continue=/addurl
http://www.bing.com/webmaster/submitsitepage.aspx

Do not expect instant results, but your site will be crawled and indexed, and good formatting and content, along with proper meta tags, a sitemap and a robots.txt, should help your page ranking. That about wraps things up – I hope you can put some of this to good use on your site.

Making the Web Work for You – Part 1

The Working Web

I’m going to do a series on leveraging web technologies to complement your marketing strategies, promote your products and services, and communicate with your clients.

It all starts with your website – which for some is merely a billboard or, worse yet, a business card – but those days are far behind us now. Today your website is a huge part of your branding effort and the center of your marketing universe. Many people claim to be web designers, but many of them have precious little understanding of branding; worse yet, some may not have a good understanding of web foundations like HTML, CSS, server-side scripting, browser support and security. I cannot stress enough the importance of getting good people to design or update your site. A template-based site slapped together in some WYSIWYG editor will get you on the ‘Net quickly, but cheap is obvious, even to the layman, and that is not the first impression you want to give your visitors.

There is no question that in this industry a website is a necessity – and any business website needs both aesthetics and function. If you do not have a site, it’s time to analyze the business need for one; if you have a site, review it and determine whether it is doing everything you want it to be doing.

As for the mechanics and technology – you can host the site internally if you have the infrastructure to support it, or you can choose from the multitude of external hosting options out there. I would recommend using a quality hosting company, but that could be a local ISP or a large outfit like 1&1.

A couple of recommendations, if you don’t know where to start:

Carve Design – a studio specializing in brand development

Kelly King Design – Kelly specializes in launching working brands

In Part 2 – I will be talking about some free tools every website should consider deploying.

Windows LNK vulnerability

There has been quite a bit of talk in security circles about the latest 0day Windows LNK (shortcut) vulnerability, which has the potential to be fairly serious. There are partial fixes and workarounds but no complete patch as yet. The following links should help you get informed and cover your bases, and below them is the interim workaround making the rounds:

http://www.sophos.com/products/free-tools/sophos-windows-shortcut-exploit-protection-tool.html

http://www.sophos.com/security/topic/shortcut.html
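
If you need a stopgap until a full patch ships, the interim workaround Microsoft has been circulating is to blank out the shortcut icon handler in the registry. Fair warning: every shortcut icon on the machine turns into a generic white blank until you restore the key. A sketch, run from an elevated command prompt – lnk-backup.reg is just my name for the backup file:

rem Back up the icon handler key first, then blank its default value
reg export "HKCR\lnkfile\shellex\IconHandler" lnk-backup.reg
reg add "HKCR\lnkfile\shellex\IconHandler" /ve /d "" /f

Log off and back on (or restart explorer.exe) for the change to take effect; reverse it later by importing lnk-backup.reg.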

NO IP for you

http://technology.canoe.ca/2010/07/26/14833401.html

340 days’ worth of IPv4 address space left – can IPv6 save the day?

Gone will be the familiar four-octet (32-bit) addresses like 123.123.123.123, replaced instead with 128-bit hexadecimal addresses like 2001:db8:85a3::8a2e:370:7334. It will certainly take some getting used to.
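
If the double colon looks odd, it is shorthand for a run of all-zero groups, so 2001:db8:85a3::8a2e:370:7334 expands to the full eight groups: 2001:0db8:85a3:0000:0000:8a2e:0370:7334. Leading zeros within a group can be dropped too, which is why db8 stands in for 0db8.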

IPv4 address space exhaustion can be blamed on many factors: the exploding accessibility of the Internet, the increasing use of small mobile devices (iPhone, anyone?), and poor reclamation of huge pools of address space, sometimes snatched up by crooks for nefarious reasons.

http://en.wikipedia.org/wiki/IPv6

Mind Your Tapes

Most businesses still rely on the venerable tape drive and tape media for backup and archiving, but what is your strategy for those tapes when they are not in the drive? The bare minimum is to rotate tapes offsite to prevent a building fire or other disaster from wiping out all your data. Some people will take them to their personal residence, but there are also storage services that will pick up and drop off tapes on a set schedule and even securely store multiple copies – perhaps monthly and yearly archives – for a fee.

Some popular rotation techniques (a sample grandfather-father-son schedule is sketched after the list):

Grandfather, father, son

Tower of Hanoi
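
As an illustration, a basic grandfather-father-son rotation with a small pool of tapes might run like this – the exact tape counts and days are up to you:

Son – four daily tapes (Monday through Thursday), reused every week
Father – four or five weekly tapes (each Friday), reused every month
Grandfather – twelve monthly tapes (last Friday of the month), kept for a year and rotated offsite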

Also remember to test your backups by doing a full restore periodically – the tapes are not going to do you any good in an emergency if the data is not there as expected.

Does this file taste funny to you?

Reminds me of a joke –

Q:  Why don’t cannibals eat clowns?

A:  They taste funny.

Of course, this entry is not about cannibals, clowns or peculiar appetites – it’s about what to do when you find a suspicious file on a machine, especially if that machine has been acting strangely and you think something untoward might be afoot. Locally installed antivirus not giving you any hints? Well, if you have isolated a suspicious file or two, here is what to do: visit http://www.virustotal.com and upload your funky files, letting their service scan them with 40-some-odd AV engines. This will give you two things:

1) usually an answer as to what that file may be

2) the creeps, because you will soon realize just how poor AV detection rates are!

While VirusTotal is not going to clean anything up for you, it will let you know whether you need to pull your wonky host off the network and start cleaning – or, as is considered best practice these days, re-imaging.
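
One aside: VirusTotal also lets you search by file hash, which is handy when the suspicious file might contain sensitive data you would rather not upload. Getting a hash is a one-liner – suspicious.exe here is just my stand-in name for your funky file:

certutil -hashfile suspicious.exe MD5     (Windows, Vista and later)
md5sum suspicious.exe                     (Linux)

Paste the resulting hash into the VirusTotal search box; if anyone has submitted the same file before, you get the existing report without uploading a byte.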

Intrusion Detection – not just for the enterprise

Intrusion detection can really be a variety of technologies – NIDS, IPS and HIPS (Network Intrusion Detection System, Intrusion Prevention System and Host Intrusion Prevention System). The differences between them are pretty straightforward. A NIDS uses a sensor or sensors to monitor network traffic and alert on anomalies; detection is usually signature-based. An IPS is a NIDS setup that sits inline with your Internet feed, which allows it to actively block attacks. Some firewalls and UTMs have IPS abilities; other IPS solutions are dedicated boxes. HIPS is a software solution that runs on endpoints (workstations, notebooks); detection is usually behaviour-based, and HIPS can be considered a last line of defense – it is sometimes a component of modern endpoint security suites. Because HIPS interacts with the system at a fairly low level, it can have adverse effects like stability and performance issues.

At AppliedUsers HQ (Parallel42) we use Snort in a NIDS configuration, although it is fully capable of acting as an IPS as well. I prefer NIDS over IPS because the ever-changing security landscape can make IPS management quite a chore. Significant tweaking of configs, signatures, thresholds and alerts may be needed to really get an IDS tuned – so in an IPS configuration you could be blocking traffic you want to let through. Snort is open source, free and often considered the de facto IDS. Detection rules are available from Sourcefire as well as other sources (Emerging Threats), and you can write your own fairly easily – see the sample rule below. We have Snort running with BASE, a web front-end to display and manage the alerts. You can also combine Snort with Barnyard2, Sguil or ACID.
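
To give you a feel for rule writing, here is a sketch of a local rule – the message and content string are made up for illustration, and local rules conventionally use SIDs of 1,000,000 and up so they never collide with the official rule sets (they typically live in local.rules):

alert tcp $EXTERNAL_NET any -> $HOME_NET 80 (msg:"LOCAL possible cmd.exe probe"; flow:to_server,established; content:"cmd.exe"; nocase; classtype:web-application-attack; sid:1000001; rev:1;)

Read left to right: alert on TCP traffic from anywhere outside headed to port 80 on your network, but only in an established client-to-server stream containing the string cmd.exe (case-insensitive).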

An IDS can do much more than just detect intrusion attempts – I also manage a Snort install in an enterprise environment, and it often detects malware before our antivirus does. It can also be used to detect network policy violations such as connections to certain websites, P2P traffic, IM traffic, porn – pretty much anything you can think up.

I think almost every network should deploy an IDS of some kind. A Snort solution is free but requires a little expertise or a willingness to RTFM – and it pays huge dividends in securing your network. There are a good number of drop-in solutions as well from your typical network vendors like Cisco, WatchGuard, etc.

Snort

BASE

Sguil
