Confessions of a Lazy Blogger

by admin 27. March 2010 05:33

I'm never short of plans for the future of this blog, but a cursory glance at the dates of the last few posts reminds me of "the best laid plans of mice and men"...

Every so often I see other bloggers explaining to their readers how they got waylaid and promising to be more dutiful in their future blogging efforts. But what gets me is that, more often than not, they were never really committed in the first place and underestimated the real time demands of maintaining a blog. My excuse is that I have been taking on extra projects without stopping to ask myself what gives me the most enjoyment. I have to admit that I miss blogging on a regular basis and won't rest easy until I get back into the swing of it.

I'm also having problems with the version of BlogEngine that I'm currently using, in that the invisible CAPTCHA is seriously flawed. Right now I'm being spammed to death and I hate not responding to people who are good enough to contribute in the comments section. My plan has been to upgrade to 1.6 but I've also been pondering the use of a different blog engine.

I've been having an internal argument for the last month on whether to switch my focus to MVC; it wouldn't be suitable for our projects at work, and I'm loath to invest more time in it myself right now because I'm finally throwing every spare moment into improving my JavaScript/jQuery skills (or lack thereof). There's a part of me that just knows that MVC is the proper way to build Web applications. However, right now it is just not practical for me to make the switch. In time, I will definitely move in that direction.

So rather than make a list of what I will do for my new blog design, I'm just going to take the first small step when I finish this post: I will download the latest version of BlogEngine and have it ready to go in the morning! First order of business will be to widen the layout to 960 pixels and decide on how radical a re-design this will actually be overall. Thanks to the 7,378 unique visitors over the last month for your patience and loyalty :-)

Tags:

Blog




The last thing I ever wanted to be was an SEO snake oil salesman. But as an ASP.NET Web Devigner (Devigner = Developer + Designer), it's not something I can afford to ignore. I recently took on a project to improve the SEO of our local Tourism website. I'm not going to delve into the project details here, but suffice it to say that a project like this can offer much insight in normal times - and I probably couldn't have picked a worse time to decide on SEO strategies for a website. The last few months have seen radical shifts in SEO priorities in general, and Google's algorithms in particular. Don't forget: SERPs (Search Engine Result Pages) now return real-time results from social networking sites such as Twitter - more on the implications of this below.

Against this backdrop, I decided on an initial analysis of the site using the new IIS SEO Toolkit - get this tool and use it! It identified about 650 no-no's, so I spent a week eliminating these one by one and wrote some code to take care of the meta tags and the like. The most important decision I made was to agree with the client to monitor the site for SEO hits, good or bad, for the coming six months. It behooves any contractor to take this course when they know in advance that they may not know anything at all!
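
For the meta tags, one approach in ASP.NET WebForms is a common base page that injects them at render time. Here's a minimal sketch along those lines - the SeoPage class and its property names are my own, for illustration only, not the actual project code:

using System;
using System.Web.UI;
using System.Web.UI.HtmlControls;

// Base page that injects per-page meta tags at render time.
// Content pages override SeoDescription/SeoKeywords with their own values.
public class SeoPage : Page
{
    protected virtual string SeoDescription { get { return ""; } }
    protected virtual string SeoKeywords { get { return ""; } }

    protected override void OnPreRender(EventArgs e)
    {
        base.OnPreRender(e);
        AddMeta("description", SeoDescription);
        AddMeta("keywords", SeoKeywords);
    }

    private void AddMeta(string name, string content)
    {
        if (string.IsNullOrEmpty(content)) return;

        HtmlMeta meta = new HtmlMeta();
        meta.Name = name;
        meta.Content = content;
        Header.Controls.Add(meta); // requires <head runat="server">
    }
}

Each content page then just overrides the two properties and the tags land in the head automatically.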

[Screenshot: IIS SEO Toolkit report]

 

Google had no choice but to find some way to reduce the amount of spam clogging its data centers and apply some qualitative heuristic to measure the relevance of sites. So, they recently re-wrote their entire algorithm (codename "Caffeine"), which caused no end of panic among the SEO heads! To this end, PageRank seems to be playing a much smaller part than ever before. And whatever small part it is playing will be very much influenced by a site's performance. In fact, performance is going to figure heavily in how well a site fares with Google overall. I can see myself getting more involved in this since it is going to affect clients' pocketbooks in a very discernible way - my prediction for 2010!

WHAT'S GOING TO MATTER

* Personalized Search
Whether you're signed in or not, Google can use your search history to tweak the relevance of your own searches. Signed in, you can opt to turn it on or off. Signed out, a cookie records your search history for 180 days. I'm not a big fan of this because I want my results to be the natural consequence of my ability to creatively grep precisely what it is I'm looking for. But that's just me, and I can readily see how this step is necessary for Google to provide "meaningful" results to people. Personalization lends even more credence to the diminishing importance of PageRank.

* Conversions
This is the number of successful transactions divided by the total number of unique visitors. Think of an e-commerce site, where you can use Google Analytics to track the conversion rate as the number of sales divided by the total number of unique visitors. Check out Google's Conversion University. (There's a quick worked example right after this list.)

* Universal Search
Remember that search results now include video, images, blogs, and books. I have been running some tests on the blog results, and my impression is that the big sites with large traffic are just getting stronger. Even entering the title of my blog (The ASP.NET Community Blog) does not show me in the first ten pages of results! I've seen other developers complaining of a lack of transparency - but then again, we're talking about search algorithms, which are as tightly guarded as a duck's arse, and that's watertight. No surprise there, but it's still unsettling because the cause and effect of SEO tweaks seems even less predictable now.
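
Circling back to conversions for a second: expressed in code, the formula from the list above is as trivial as it sounds. The numbers here are invented purely for illustration:

using System;

class ConversionRate
{
    static void Main()
    {
        // Conversion rate = successful transactions / total unique visitors.
        int sales = 42;
        int uniqueVisitors = 7378;

        double rate = (double)sales / uniqueVisitors;
        Console.WriteLine("Conversion rate: {0:P2}", rate); // prints the rate as a percentage (~0.57%)
    }
}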

THE REAL IMPLICATIONS

1) Sites with basic SEO errors will be penalized.

2) Sites with poor performance will be penalized. If you stop and think about it, there must be a huge increase in the amount of content that Google has to index in light of real-time results pouring into their data centers every second. Something has to give. Check out the performance of your site using the Google-recommended WebPageTest application.

UPDATE

Best Practices for Speeding up your Website


Tags:

Google | SEO



Twitter - Saving your Tweets

by agrace 19. December 2009 19:44

In my never-ending search for a tool/technology to manage my inflow of information from the Web, I opened a Twitter account that could also act as a set of pseudo-bookmarks. My IE8 and Firefox bookmarks already stretch to the floor when expanded. Plus, I'm convinced that there must be some unknown aspect of HCI that dictates that once something is saved as a bookmark, it is never opened again. Bookmark drop-downs are about as user-friendly as crotch rot. I've looked at Evernote and OneNote, the latter being the most promising to date. I plan to take another look at the products from Microsoft Live Labs.

Recently, I passed the 700 tweet mark and decided to archive what I had. But how? I gleaned the following tidbits from "Twitter Tips, Tricks, and Tweets" by Paul McFedries. You can save a local copy of your tweets by entering the following URL in your browser:

http://twitter.com/statuses/user_timeline/account.xml?count=n

where account = your account name, and
n = number of tweets

 

[Screenshot: Tweets XML - the timeline rendered as XML in the browser]

 

This will open your n most recent tweets as XML in the browser. From here, you can import the saved XML file into Excel 2007 as follows:

1) In Excel, click on the Data tab

2) Data -> From Other Sources -> From XML Data Import

3) Click the cell where you want the data to go

4) Save as, using the id of the last tweet (see the sketch below for pulling that id out programmatically)
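
If you don't feel like scrolling through the XML to find that last id, a quick LINQ to XML sketch will pull it out of the saved file. This assumes the statuses/status/id layout the timeline URL returns, and "tweets.xml" is whatever you named your saved copy:

using System;
using System.Linq;
using System.Xml.Linq;

class LastTweetId
{
    static void Main()
    {
        // Load the timeline saved from the browser above.
        XDocument doc = XDocument.Load("tweets.xml");

        // Highest <id> directly under each <status> = the most recent tweet.
        long lastId = doc.Descendants("status")
                         .Select(s => (long)s.Element("id"))
                         .Max();

        Console.WriteLine("Last tweet id: {0}", lastId);
    }
}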

[Screenshot: Save Tweets - saving the file in Excel]

 

[Screenshot: Tweets URL with the since_id parameter]

 

The reason I include the last tweet id is to make it easy to archive next time, starting where I left off and using the following syntax:

http://twitter.com/statuses/user_timeline/account.xml?since_id=6839319097

 

If you look at the Excel sheet you will see that the last tweet had an ID of 6839319097. This will download an XML version of all my tweets since then. Now I have a proper archive of my tweets that I can search on :-)
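
You could even script this step instead of using the browser. Here's a minimal sketch using WebClient - substitute your own account name and last archived tweet id for the example values:

using System;
using System.Net;

class ArchiveTweets
{
    static void Main()
    {
        // Example values - use your own account name and last archived id.
        string account = "agrace";
        long sinceId = 6839319097;

        string url = string.Format(
            "http://twitter.com/statuses/user_timeline/{0}.xml?since_id={1}",
            account, sinceId);

        using (WebClient client = new WebClient())
        {
            // Name each archive by date so successive runs line up.
            string file = "tweets-" + DateTime.Now.ToString("yyyy-MM-dd") + ".xml";
            client.DownloadFile(url, file);
            Console.WriteLine("Saved {0}", file);
        }
    }
}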

From here, you might want to take a look at the TweetSharp API - a complete .NET library allowing you to build Twitter applications in C# and .NET 3.5. I've started reading a great book on the topic: "Professional Twitter Development with Examples in .NET 3.5" by Daniel Crenna.


Tags:

twitter