January 26, 2007

Google drops bombs

Link: Google drops bombs

Filed under: Internet, Search Engines, Webmaster, Google, Web Development by Brian Turner


Google has implemented a change to its search ranking algorithm that defuses existing Google Bombs.

Google Bombs - also known as Link Bombs - were primarily set up by campaigners looking to make a political statement using Google.

For example, when bloggers decided to set up a campaign against George W Bush, they created links with the anchor text “miserable failure” pointing to the White House website.

Because links form a fundamental tool of communication on the web, and because Google normally gives them some degree of weight when ranking search results, this meant that the White House site ranked top in a Google search for “miserable failure”.

Although Google has previously dismissed these as “pranks”, the situation has become acutely embarrassing for Google, which has been seeking increasing political influence on Washington’s Capitol Hill.

This is not least because Google has been accused of validating these political statements by continuing to allow them to rank top.

According to an official statement on the Google Blog, Google much prefers to apply automated methods to reduce the impact of undue or unwelcome influences on its results.

While removing the Google Bombs is unlikely to affect the overall search user experience, it appears that Google considers this chapter in its history now closed.

Concerns over Google & Mozilla anti-phishing

Link: Concerns over Google & Mozilla anti-phishing


Concerns have been raised about the Google and Firefox anti-phishing drive, after it was revealed that raw data published online contained the usernames and passwords of phishing victims.

Reports that personal data was being published were raised as far back as September 2006, but apparently no changes have yet been made to address the issue.

It’s not the only problem raised - there have been additional reports of mistaken identity, where even bona fide banking sites have been flagged as phishing sites.

Overall, despite the privacy concerns previously raised, the anti-phishing initiative between Mozilla and Google remains laudable.

However, the reports highlight that very important changes are required to ensure that users who have been phished do not have their login information published openly online.

Additionally, it may be useful to introduce a whitelist of legitimate banking domains, to help reduce the risk of false-positive warnings.

This latter point should especially be picked up by Microsoft, whose recent security “improvements” to Internet Explorer 7 mean that it cannot tell the difference between routine use of security certificates and abusive use of domain names.

Webmasters frequently make routine use of shared security certificates - for example, when logging into cPanel, or logging into AdSense via Google UK rather than Google.com.

However, in both cases, routine use of shared security certificates will cause IE7 to issue a very stark security warning. Not only is the warning plainly wrong, but crying wolf may also desensitise users to actual phishing attacks.

January 19, 2007

1and1 suffers technical failures

Link: 1and1 suffers technical failures

Filed under: Internet, Webmaster, Webhosting, Web Development by Brian Turner


Webhosting company 1and1 suffered a series of technical problems this week that have left customers unable to receive emails.

The problems began on Monday, and although 1and1 claimed to have fixed the original issue, serious problems are still ongoing.

Customers of the webhosting company have been desperately seeking information in online communities on how to restore any form of normal service.

However, at present, they remain frustrated and unable to access their emails properly.

1and1 is a company with a chequered reputation - although it has marketed itself aggressively and pushed hard on low prices, the result has been a low quality of service when it is needed most.

Complaints about customer support are common, with telephone support offering little useful help, while support emails can go entirely unanswered.

Meanwhile, we can only hope that 1and1 is able to restore at least a basic level of service for affected customers, for their sake.

ADDED: 1and1 have provided Platinax with the following statement on issues so far:

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Following an important update to our mail server software and the subsequent restart of the mail servers, there have unfortunately been delays in the delivery of incoming email to 1&1 mail boxes.

Email messages are stored safely in a queue, but are subject to delay.

Whilst on Thursday morning there was no delay, as of 11am today, Friday, there is an average mail delivery delay of 2 hours. Spam mails are being delivered with a lower priority and hence remain longer in the mail queue.

Many email users waiting for delayed mails currently access their mailboxes at a high frequency. Thus, in peak times there may also be delays in connecting to Inboxes.

The SMTP service (outgoing mail) is not affected and runs without delay.

Due to planned updates of software and hardware systems, some disruptions to email delivery to Inboxes may continue throughout today, Friday 19 January.

The delay in incoming emails will decrease as full service is gradually restored and emails are delivered.

We sincerely apologise to our customers for any inconvenience caused due to the queuing of emails and thank them for their co-operation and understanding while we continue to resolve these issues. As always, customers who would like to discuss any concerns are kindly asked to contact 1&1’s UK Complaints Dept.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

January 10, 2007

Government to kill websites

Link: Government to kill websites

Filed under: Internet, Webmaster, Web Development, Political by Brian Turner

Computers & Internet

The UK government is to cull over 550 of its existing websites.

The aim of the move is to concentrate more information into existing sites, to develop them into “supersites” - ie, websites that will be more frequently used and visited.

The government has been previously criticised over the number of websites it runs. In 2003, the UK government had over 3000 websites, something it accepted then was too many.

Additionally, there have been complaints of poor accessibility across the majority of government websites, with poor W3C validation making them inaccessible to people with disabilities and to those using browsers other than Internet Explorer.

At present, the UK government has over 900 websites, but has stated that only 26 will definitely be kept, with 550 already scheduled for removal and hundreds of others up for review.

January 3, 2007

Confusion over company law amendment

Link: Confusion over company law amendment

Filed under: Business, Internet, Webmaster, Ecommerce, Web Development, Legal by Brian Turner

The New Year has opened in confusion with regards to when and where a company must display corporate information online.

This comes after the EU First Company Law Amendment Directive was legislated into UK law for January 1st.

The question is - in what way exactly does it require changes in company communications?

For example, companies are required to display registration and registered office information on their websites, something many will already have complied with.

However, the situation becomes more confusing with online communications.

For example, will staff at limited companies need to display a company registration number and address in emails?

Although the initial suggestion is that this may be the case, the problem then arises where companies may reply to email lists, online communities, and forums, when speaking in an official capacity. This is not least because such groups may not always allow for such information to be posted.

According to the report at the Register:

Such information is already required on “business letters” but the duty is being extended to websites, order forms and electronic documents.

Additionally, website developers need to understand that the regulations may apply even to websites that do not sell services online, directly or indirectly.

This means that any website connected with company operations may need to display detailed contact and company registration details.
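As a minimal illustration, the kind of footer block such sites may need could look like the following (the company name, number, and address here are invented for the example):

```html
<!-- Illustrative only: the company details below are invented -->
<div id="company-info">
  <p>
    Example Widgets Ltd. Registered in England and Wales,
    company no. 1234567.<br>
    Registered office: 1 Example Street, London.
  </p>
</div>
```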

With continued confusion over some of the more ambiguous elements of the ecommerce regulations, it remains to be seen to what extent these may be applied and enforced.

In the meantime, Out-law.com provides a useful guide on compliance with the recent changes in ecommerce regulations relating to company law:

The UK’s E-commerce Regulations
Email notices
The UK’s Distance Selling Regulations

November 15, 2006

Video viral marketing proves a marketing failure

Link: Video viral marketing proves a marketing failure

Filed under: Marketing, Web Development, IPTV, PPC by Brian Turner


Marketing Experiments (ME) - who recently acquired MarketingSherpa - released a study claiming that viral marketing via videos was an incredibly cost-effective way to get conversions.

However, the study has proved to be seriously flawed.

In the study, the researchers spent $9600 creating 28 videos, which were then syndicated on YouTube, Google Video, and others.

Each video was simply a non-promotional video, with a link to a website at the end.

According to ME, over a 60-day period the videos were viewed over 300,000 times, and generated over 4,000 visitors to the targeted website via the link at the end of each video.

1.49% of this traffic was then converted into newsletter subscribers.

ME then claimed this was far more effective than PPC, citing PPC as requiring around $20 of spend per subscriber.

The problem is that ME claimed the advertising cost - and therefore the cost per acquisition (CPA) - of the video marketing experiment was zero.

However, they failed to factor in their initial spend of almost $10,000 on producing the videos.

Additionally, converting 1.49% of 4,162 visitors means they only gained 62 newsletter subscribers - for a cost of almost $10,000.

That leaves a CPA of $161.29 per newsletter subscriber: far more than the estimated $20 CPA through PPC.
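The arithmetic is easy to check - a quick sketch in PHP, using the rounded $10,000 production figure quoted above:

```php
<?php
// Sanity-checking the figures above, using the rounded $10,000
// production cost quoted in the article.
$visitors   = 4162;    // visitors driven by the videos
$conversion = 0.0149;  // 1.49% newsletter sign-up rate
$spend      = 10000;   // approximate cost of producing the videos

$subscribers = (int) floor($visitors * $conversion);  // 62
$cpa         = $spend / $subscribers;                 // ~161.29

printf("Subscribers: %d, CPA: $%.2f\n", $subscribers, $cpa);
```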

Overall, the point is that syndication of content - whether via YouTube, Google Search, or news services - can provide a free source of traffic.

However, the cost of the development of that content still needs to be factored into acquisition costs - something Marketing Experiments obviously failed to do.

October 23, 2006

W3C opens secure web initiative

Link: W3C opens secure web initiative

Filed under: Internet, Security, Web Development, Phishing by Brian Turner

The W3C has announced plans to try and make the web more secure for surfers.

The aim is to develop a shared set of trust standards via its Web Security Context Working Group, which can help web users get a better idea of the security context of the webpages they visit.

It comes after warnings throughout 2006 that surfers are being snared into visiting websites - found in both natural and paid-for search engine results - which attempt to download malware onto visitors’ machines.

Google has already tried to implement a system to help warn users when websites may be unsafe.

However, the Stop Badware campaign has been criticised for not properly adjudicating threats, and for blacklisting websites even after a temporary lapse in security.

The issue of phishing also continues to be a serious concern.

While the W3C’s initiative is laudable, it remains to be seen how easily its goals can be met without being compromised by third parties, and whether any significant developments will come from the project.

October 18, 2006

IPTV video clips face EU regulation

Link: IPTV video clips face EU regulation

Filed under: Internet, Web Development, IPTV by Brian Turner


An EU directive currently in circulation threatens to force broadcasting regulations on the internet.

The Television Without Frontiers directive threatens to subject any website hosting video content - whether streaming IPTV or simply hosting video clips - to broadcasting regulations.

This would mean that anyone uploading a simple video clip of themselves, their local club, or any other legal content would require a licence first.

The EU argues that video content online needs to be regulated for the purposes of advertising, hate speech and the protection of children.

However, the UK government is already arguing the case against broadcasting regulation, pointing out that the UK has rigorous protections in these areas.

Overall, it seems that yet again politicians and policy makers remain a couple of steps behind technology, then attempt to rush forward with regulation that suggests they’re keeping up with it - applying a big fudge in the process.

We can only hope that common-sense prevails.

October 16, 2006

Help for small business continues as Platinax celebrates 2 years online

Link: Help for small business continues as Platinax celebrates 2 years online

Platinax is a free business resource and mentoring service that this month celebrates 2 years of helping SMEs.

Based in the Highlands of Scotland, Platinax is run by a volunteer staff of company directors from across the UK.

The key aim of Platinax is to help other SMEs work better with the internet and increase profitability.

This is largely because the majority of businesses in the UK are still failing to overcome basic challenges in technical, presentation, and marketing areas - challenges that can be easily solved with a simple awareness of the issues involved.

Platinax offers help through five key areas:

1. Business Forum

The Platinax Business Forum is a place where business owners seek free solutions, learn tips and tricks, and network with other business owners.

Key discussion areas include:

  • Business planning
  • Accounts
  • Legal compliance
  • Marketing
  • Advertising
  • Search engine optimisation
  • Internet technology
  • Website management
  • Webdesign & development
  • Server hosting

2. Business Directory

The Platinax Directory is rated as one of the internet’s best quality directories.

It’s also one of the most active areas of the website, and can help provide national and local listings to drive targeted sales.

Simply click through the directory links to a suitable category for a listing in local, national, and international business categories.

3. Small Business News

Platinax News provides the latest business and technology information for SMEs in the UK, and is syndicated by Google News.

4. Business articles

Platinax offers an exclusive reading list of articles, tutorials, and interviews for those willing to take on the technical challenges of the internet. Key subject areas include internet marketing, website management, and business practices online.

5. Advertising

Platinax receives an average of 350,000 to 400,000 page views per month.

You can reach this audience directly with an advertising space on the left of every page, containing both a banner and direct link to your website.

Through October Platinax is offering a special promotional price of just £79.95/month - but only 4 advertising spaces are available.

According to Dave Ashton at lead generation service Bizal Ltd:

“From the new business enquiries received to date over 70% have been genuine opportunities that have been able to quote for either a lead generation solution or sales development services i.e. sales training and hence are delighted with types of companies that see our banner advert.

We are also delighted with the level of customer service and hence look forward to continuing a mutually beneficial relationship.”

Platinax intends to continue providing free help and support for SMEs in the UK, and looks forward to celebrating the next 2 years on the internet.

September 26, 2006

Net delivers mobile domains

Link: Net delivers mobile domains

Filed under: Internet, Webhosting, Web Development, Browsers, Mobile by Brian Turner


Public registration of .mobi - the newly accredited domain name extension - opened today.

The aim of .mobi domain names is to cater specifically for mobile device users.

The domain registration body overseeing the project - Mobile Top Level Domain (MTLD) - insists it will apply web standards to ensure .mobi sites work well for mobile users.

It comes at a time when both internet use and mobile phone use have become a global phenomenon - but although mobile phones themselves are often internet-enabled, few mobile phone users access the net with them.

Key reasons include speed and cost, with mobile phone users having to pay much higher prices for internet access that is slower than broadband.

Additionally, the majority of websites simply will not display properly on a mobile browser, further reducing any incentive for users to go online with their mobile devices.
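One era-typical technique for serving mobile users - shown here as a sketch, with placeholder filenames - is a dedicated handheld stylesheet alongside the desktop one:

```html
<!-- "mobile.css" and "desktop.css" are placeholders; the handheld
     media type tells mobile browsers which stylesheet to apply -->
<link rel="stylesheet" media="handheld" href="mobile.css">
<link rel="stylesheet" media="screen" href="desktop.css">
```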

Although 13,000 .mobi domain names have already been registered, MTLD expects to see another 200,000 registered over the next year.

However, the whole .mobi domain launch could turn out to be something of a white elephant.

If mobile devices become able to display ordinary web pages properly, making for a proper user experience, then the uniqueness of the domain could well become irrelevant.

July 27, 2006

Teen mags move online

Link: Teen mags move online

Filed under: Business, Internet, Marketing, Web Development by Brian Turner


Teen People, a teen magazine that launched both print and web editions, is closing down its print edition because the internet version is much more popular.

It becomes the latest in a string of teen publishers who have found that teen audiences are much more interested in online content than printed content.

Earlier this year Smash Hits magazine, aimed at the teen popular music market, announced it was closing down after losing print sales to an audience looking for the same information online.

Overall, it shows that publishers absolutely need to keep an eye on the impact of the net on their audiences, and that the overheads involved in print runs can often fail to justify themselves in the face of online publishing.

July 14, 2006

Search engines support removal of DMOZ titles

Link: Search engines support removal of DMOZ titles


In another move to empower webmasters, a new meta-tag is being supported by some search engines.

The new meta-tag tells search engines to use the actual page description and title, as opposed to a DMOZ description and title.

This is because Google and MSN have for some time been using DMOZ description and titles for websites listed in the Open Directory Project (ODP) - aka DMOZ - in search results.

Because of the varying quality of DMOZ editorial descriptions, this could mean a website not getting click-throughs from search results where the description was poor.

This was illustrated last October, when Brian Turner from Britecorp asked for his listing to be removed from DMOZ - because the description for his internet marketing website was simply: “Includes service details, webmaster articles, and contact details”.

The new move allows webmasters to opt out from the DMOZ title and description, to instead use their own.
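The opt-out itself takes the form of a single meta tag in the page’s head - the NOODP value supported by Google and MSN:

```html
<!-- Tells supporting search engines not to use the DMOZ/ODP
     title and description for this page -->
<meta name="robots" content="noodp">
```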

March 30, 2006

Government websites failing

Link: Government websites failing

Filed under: Internet, Webmaster, Web Development by Brian Turner

The University of Southampton has released a study that shows that 60% of UK government websites contain serious HTML errors, and fail W3C standards on website design.

A big part of the problem is that the websites are designed specifically for Internet Explorer - which does not adhere to internet standards, and even introduces some of its own.

This means that people not using Internet Explorer can often have problems using the websites - an issue that was highlighted by SciVisum last year.

Perhaps more seriously, it also means that people with disabilities may not be able to use many government websites - which could mean these websites break the law on accessibility for people with disabilities, as set out under the Disability Discrimination Act.

The problem of bad website design and coding is so acute that the Disability Rights Commission has published PAS 78, a guideline on good practice in designing accessible websites.

An even greater problem for the government websites is looming - Microsoft is due to release Internet Explorer 7 later this year, and it is expected to adhere more closely to W3C standards.

Already some banks are warning that their internet banking systems will not be accessible with the new browser, as they fail so miserably on standards.

And until the government overhauls its failing websites, once Internet Explorer 7 is released, the sites could fail more than just a minority of users.

For more information on how to bring your website up to date with W3C standards, there are plenty of online communities that can help.

UPDATE: Microsoft has announced that it’s changing how Internet Explorer 6 works with ActiveX controls on the next major patch.

This will prevent a lot of websites from being able to use existing ActiveX controls in their website design - especially for delivering advertising. Developers have been given 60 days from the date of the patch release to change, or risk running dysfunctional websites.

March 8, 2006

PAS 78: British standards for websites

Link: PAS 78: British standards for websites

Filed under: Webmaster, Web Development by Brian Turner

The British Standards Institute (BSI) has released information on a new standard - for disability access for websites.

Known as “Publicly Available Specification (PAS) 78 – Guide to Good Practice in Commissioning Accessible Websites”, it was developed with the Disability Rights Commission, after a study claimed that 81% of websites were not properly accessible to people with disabilities.

According to the BSI, applying PAS 78 will provide:

  • compliance with the Disability Discrimination Act (DDA),
  • the creation of accessible websites,
  • wider audience reach,
  • improvement of search engine listings due to accessible content,
  • the easy transfer of this content to other media such as interactive TV or mobile phones
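A small sketch of the kind of markup these guidelines encourage - alternative text for images and explicit labels for form fields (the filenames and field names here are illustrative):

```html
<!-- Alternative text lets screen readers describe the image -->
<img src="growth-chart.png" alt="Chart showing visitor growth over 2005">

<!-- An explicit label ties the field to its description -->
<label for="email">Email address</label>
<input type="text" id="email" name="email">
```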

It remains to be seen how much impact PAS 78 will have in the world of web development, where similar standards were supposed to have been in place since 2000 via the W3C.

However, the coming launch of Internet Explorer 7, which is believed to adhere more closely to web standards, could provide the necessary push towards common adoption of PAS 78.

February 22, 2006

Broadband reaches 64% of UK

Link: Broadband reaches 64% of UK

Filed under: Internet, Marketing, Web Development by Brian Turner

The BBC reports that broadband now accounts for 64% of internet use in Britain.

It also suggests that broadband will continue to become more widely used in Britain as companies compete for the market, and greater broadband speeds are made available.

Additional information on internet use comes from ONS figures for December.

According to these figures:

  • 61 per cent had bought or ordered goods, tickets or services
  • People aged 25-44 were most likely to buy on-line (67 per cent)
  • People aged 65 and over were least likely to buy on-line (41 per cent)

Place of access:

  • Home: 86%
  • Work: 48%
  • Another person’s house: 33%
  • Education facility: 16%
  • Public library: 10%

Types of use:

  • 92% had used a search engine to find information
  • 78% sent an e-mail with an attachment
  • 32% posted in an online community
  • 22% had used file-swapping
  • 20% built a webpage

Additionally, another report by the Office for National Statistics details internet use by region, and finds that London and the South East are the biggest internet users.

Overall, the information shows that internet use is fast becoming an integral part of life in the UK.

Nominet faces rebellion on changes

Link: Nominet faces rebellion on changes

Filed under: Webmaster, Webhosting, Web Development by Brian Turner

Nominet, the governing body in charge of overseeing UK domains, is facing rebellion over plans to overhaul the company.

Advocates say that Nominet needs to change because it wasn’t structured to cope with the vast market in UK domain names that now exists.

Critics are concerned that the changes will jeopardise the body’s non-profit status, and put its interests at odds with UK domain registrar companies.

While it is certain that Nominet will need to restructure some of its operations, if the current changes being proposed end up as a platform for creating a public for-profit company, then it could certainly increase the price of UK domains in future.

June 23, 2005

SciVisum finds website standards lacking

Link: SciVisum finds website standards lacking

Filed under: Web Development by brian_turner


Web-testing firm SciVisum assessed 100 leading consumer websites and found that 10% of them did not work correctly on the Firefox web browser, including government website Jobcentreplus.gov.uk.

Firefox, which was launched in November 2004, is an open source alternative to Microsoft’s Internet Explorer. Although most people still use Internet Explorer, Firefox, created by Mozilla, is gaining in popularity. Its share of the browser market increased to 8% in May, compared with 5.59% at the beginning of the year, according to US-based analysts NetApplications. Internet Explorer’s share of the market dropped to 87.23% in May, compared to 90.31% in January.

SciVisum found that of the websites it tested, 3% turned away non Internet Explorer (IE) users and 7% included non-standard code recognised only by Internet Explorer.

Deri Jones, chief executive of SciVisum said “web developers are used to testing their sites just using IE rather than so-called standards-compliant browsers, which only use code ratified by the World Wide Web consortium.

“There is a certain business logic to this as IE is the most widely used browser”.

The success of Firefox has prompted Microsoft to start developing a new version of IE.

Sites which meet World Wide Web consortium standards are easier for disabled people to use. Mr Jones said “Over time developers have begun to misuse the original standards created for the web to create websites that look great to you and I, but are confusing to a disabled person using a screen reader which needs to make sense of the content”.

However, even the SciVisum site was later found not to be W3C compliant.

March 28, 2005

Adobe Photoshop CS2: announcement coming

Link: Adobe Photoshop CS2: announcement coming

Filed under: Web Development by brian_turner

Rumours are circulating that Adobe are about to announce plans for the next stage of their Photoshop graphics software, which will be named “Adobe Photoshop CS2”.

Adobe Photoshop is one of the most common graphics software packages among graphic designers, and a market leader in graphical editing.

The new Adobe Photoshop CS2 software will apparently feature:

  • New Vanishing Point feature,
  • Dramatic Camera Raw workflow enhancements,
  • Customizable menus,
  • Smart Objects

“Photoshop CS2 pushes the envelope with powerful features and simplified workflows that provide photographers and creative professionals the freedom to deliver stunning images,” said Bryan Lamkin, senior vice president of Digital Imaging and Digital Video Products at Adobe.

News of the new Adobe Photoshop CS2 software only emerged after a press release dated for April was accidentally published online, and Google indexed and recorded the page.

Screenshots of the new Adobe Photoshop CS2 have already been released.

January 26, 2005

W3C issue XML & SOAP standards guidelines

Link: W3C issue XML & SOAP standards guidelines

Filed under: Web Development, Programming by brian_turner

The W3C, the consortium that develops web standards, has released a new set of standards guidelines for use with XML and SOAP applications.

XML-binary Optimized Packaging (XOP), SOAP Message Transmission Optimization Mechanism (MTOM) and Resource Representation SOAP Header Block (RRSHB) are all designed to help bridge the use and development of binary data with XML.

This is important for addressing issues such as sending a video clip from a mobile device to a desktop PC, as lack of co-ordination between different software packages can make even this apparently simple act slow and difficult to manage.

Although these are intended as a workaround rather than a full fix, the W3C XML Binary Characterization Working Group is already looking into fully comprehensive ways in which XML and binary data can be better brought together.
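As a rough sketch of the idea behind XOP/MTOM (the operation and element names here are invented for illustration): rather than base64-encoding binary data inline, the XML carries a reference to a raw MIME part by Content-ID:

```xml
<!-- sendClip/video are hypothetical element names; the binary
     video bytes travel as a separate MIME part identified by
     the cid: reference below -->
<soap:Envelope xmlns:soap="http://www.w3.org/2003/05/soap-envelope">
  <soap:Body>
    <sendClip>
      <video>
        <xop:Include
            xmlns:xop="http://www.w3.org/2004/08/xop/include"
            href="cid:video-clip@example.org"/>
      </video>
    </sendClip>
  </soap:Body>
</soap:Envelope>
```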

January 19, 2005

Critical PHP bug slows dynamic applications

Link: Critical PHP bug slows dynamic applications

Filed under: Web Development, Programming by brian_turner

After the recent security concerns with PHP, upgrades from PHP 4.3.9 to PHP 4.3.10 have left some dynamic applications with serious problems with slowed performance.

In a report at the PHP development community - Bug #31332, “unserialize() works terribly slow on huge strings compared to 4.3.9” - it is pointed out that this error is critical for many PHP-based systems, such as vBulletin and Drupal.

The issue centres on the unserialize() function, which, when used on serialized multidimensional arrays, can slow the restoration of stored data by as much as a factor of 20.

In layman’s terms: if the software you run is like a pub, and the database the software runs from is like the beer cellar, then the barman now has lead boots.
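The affected call path can be sketched in a few lines of PHP - the sort of thing forum software does with cached settings (the data here is artificial):

```php
<?php
// Build an artificial multidimensional array of the kind forum
// software caches, store it as one huge serialized string, then
// time the unserialize() call affected by bug #31332.
$data = array();
for ($i = 0; $i < 5000; $i++) {
    $data[] = array('id' => $i, 'title' => str_repeat('x', 40));
}
$blob = serialize($data);              // one huge string

$start    = microtime(true);
$restored = unserialize($blob);        // the slow call on PHP 4.3.10
$elapsed  = microtime(true) - $start;

printf("unserialize of %d bytes took %.4f s\n", strlen($blob), $elapsed);
```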

A workaround is already in beta, and a public release of the patch is expected soon.
