Expertise

SEO DATA DEV

I offer an integrated skillset comprising front-end web development, search engine optimisation, and data services.

I am fluent in a range of front-end languages and frameworks and experienced with a variety of popular content management systems, SEO tools, and data management platforms.

Web development

I build fast, secure, and responsive websites which incorporate the latest web standards and protocols. I can code custom sites with bespoke features and functionality or work within various popular content management systems.

SEO

I have worked within the digital marketing industry as a technical SEO analyst, partnering with leading brands and organisations to enhance and sustain their organic rankings, user engagement, and conversion rates.

Data & analytics

I'm a junior data engineer and have experience building reporting frameworks for organic and paid marketing activity. I am learning the principles of data science in order to better understand the data I work with.

Web development

Core technologies and frameworks
A vertical bar chart showing the number of hours I have spent using the following languages and frameworks: HTML (22000 hours), CSS (22000 hours), JavaScript/jQuery (16000 hours), Bootstrap (12820 hours), SCSS (Sass) (12500 hours), Composer (6000 hours), Jekyll (5000 hours), PHP (430 hours).
Content management systems
A doughnut chart showing the number of hours I have spent using the following content management systems: WordPress (20000 hours), Drupal (8760 hours), Wix (13140 hours), Joomla (2190 hours), Umbraco (433 hours), Sitecore (730 hours), Magento (700 hours), Shopify (600 hours), Moodle (800 hours).

Additional technologies

Git

I use Git every day when developing. Each site I build has its own private external repository so that I can manage changes over time and experiment with speculative features safely within branches.

Command Line Interface

I use the zsh shell when developing, especially when I’m working with Drupal. I also use the command line in the course of my SEO and data work to crawl sites, extract assets and access server logs, along with other routine tasks.

LaTeX

LaTeX was the first markup language I learned, at university, in order to typeset logical notation. I still use it when creating learning resources, presentations and formal documents: any time I need to convey ideas in a clear and elegant format.
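
By way of illustration, a few lines are enough to typeset logical notation cleanly. The snippet below is a small, made-up example rather than an excerpt from any particular document.

    \documentclass{article}
    \usepackage{amssymb}
    \begin{document}
    % Modus ponens, stated in standard logical notation
    \[
      \bigl( (P \rightarrow Q) \land P \bigr) \vdash Q
    \]
    % A quantified statement: every natural number has a successor
    \[
      \forall n \in \mathbb{N} \; \exists m \in \mathbb{N} : m = n + 1
    \]
    \end{document}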

Search engine optimisation

Key skills

  • Conducting backlink audits using industry software
  • Identifying low-quality backlinks and referring domains and orchestrating their disavowal
  • Retaining and cultivating valuable backlinks as disclosed by site logs and third-party software

  • Utilising site crawler software, server logs and other diagnostic tools to identify factors that negatively impact a site’s crawl budget and the indexation of its pages, for example (redirect checks are sketched after this list):
    • 404s
    • improper temporary redirects
    • redirect loops
    • redirect chains
    • duplicate content
  • Updating, optimising, and submitting XML sitemaps and robots.txt
  • Conducting index audits and engineering the indexation of high-value content
  • Fixing broken links, removing duplicate content and resolving orphaned pages
  • Preparing, managing and troubleshooting site migrations to ensure a smooth domain/protocol transition which retains rankings
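
To give a flavour of the redirect diagnostics above, here is a minimal sketch in TypeScript (assuming Node 18+ for its built-in fetch) that follows redirects one hop at a time and flags chains and loops; the URL is purely illustrative.

    // redirect-chain.ts - a minimal sketch, assuming Node 18+ for the built-in fetch.
    // Follows redirects one hop at a time and reports the full chain.
    async function traceRedirects(startUrl: string, maxHops = 10): Promise<string[]> {
      const chain: string[] = [startUrl];
      let current = startUrl;
      for (let hop = 0; hop < maxHops; hop++) {
        // HEAD keeps the check light; some servers treat HEAD oddly, so GET is a fallback.
        const res = await fetch(current, { method: "HEAD", redirect: "manual" });
        if (res.status < 300 || res.status >= 400) return chain; // not a redirect: chain ends here
        const location = res.headers.get("location");
        if (!location) return chain;
        const next = new URL(location, current).toString(); // resolve relative Location headers
        if (chain.includes(next)) {
          // A URL we have already visited means a redirect loop.
          chain.push(next);
          console.warn(`Redirect loop detected at ${next}`);
          return chain;
        }
        chain.push(next);
        current = next;
      }
      console.warn(`Stopped after ${maxHops} hops - possible excessive chain.`);
      return chain;
    }

    // A chain longer than two entries is a candidate for a single direct 301.
    traceRedirects("https://example.com/old-page").then((chain) => console.log(chain.join(" -> ")));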

  • Conducting keyword research to determine audience profile and identify content gaps
  • Optimising and refining on-page content designators (page titles, headings, URL structures, meta descriptions) on a rolling, iterative basis so that content reflects users’ search intent (a simple audit sketch follows this list)
  • Using site analytics and engagement data to cull low-value pages and build on popular content
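
The sketch below illustrates the shape of that rolling audit: checking page titles and meta descriptions from a crawl export against rough length thresholds. The thresholds and sample data are illustrative guides, not hard rules.

    // onpage-audit.ts - an illustrative sketch; thresholds are rough guides, not fixed limits.
    interface PageMeta {
      url: string;
      title: string;
      metaDescription: string;
    }

    const TITLE_MAX = 60;        // titles much longer than this tend to truncate in results pages
    const DESCRIPTION_MAX = 155; // likewise for meta descriptions

    function auditPage(page: PageMeta): string[] {
      const issues: string[] = [];
      if (page.title.trim().length === 0) issues.push("missing title");
      else if (page.title.length > TITLE_MAX) issues.push(`title over ${TITLE_MAX} characters`);
      if (page.metaDescription.trim().length === 0) issues.push("missing meta description");
      else if (page.metaDescription.length > DESCRIPTION_MAX)
        issues.push(`meta description over ${DESCRIPTION_MAX} characters`);
      return issues;
    }

    // Example run over a (made-up) crawl export.
    const pages: PageMeta[] = [{ url: "/services", title: "Services", metaDescription: "" }];
    for (const page of pages) {
      const issues = auditPage(page);
      if (issues.length) console.log(`${page.url}: ${issues.join(", ")}`);
    }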

  • Auditing information architecture and navigational structures to improve accessibility, match users' search intent and improve content taxonomies
  • Improving and maintaining vertical and horizontal internal link paths to increase search engine crawlers’ access to content
  • Ensuring seamless navigational user experiences across devices and viewports

  • Using various speed assessment tools to identify performance bottlenecks (see the measurement sketch after this list) and resolve them via:
    • Cache optimisation
    • File reduction, compression and format optimisation
    • Reducing/amalgamating resource requests and simplifying the critical rendering path
    • Setting up CDNs
    • Implementing AMP HTML
  • Mobile and device specific performance fixes
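
As a small illustration of this kind of diagnosis, the sketch below uses the browser's native PerformanceObserver API to log the Largest Contentful Paint candidate and long main-thread tasks; it runs in the page rather than on the server.

    // perf-observe.ts - a browser-side sketch using the native PerformanceObserver API.
    // Logs the Largest Contentful Paint (LCP) candidate and any long main-thread tasks.
    const lcpObserver = new PerformanceObserver((list) => {
      const entries = list.getEntries();
      const latest = entries[entries.length - 1]; // the most recent LCP candidate
      console.log(`LCP candidate: ${Math.round(latest.startTime)} ms`);
    });
    // buffered: true replays entries recorded before the observer was registered.
    lcpObserver.observe({ type: "largest-contentful-paint", buffered: true });

    const longTaskObserver = new PerformanceObserver((list) => {
      for (const entry of list.getEntries()) {
        // Tasks over 50 ms block the main thread and hurt interactivity.
        console.warn(`Long task: ${Math.round(entry.duration)} ms at ${Math.round(entry.startTime)} ms`);
      }
    });
    longTaskObserver.observe({ type: "longtask", buffered: true });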

  • Optimising for foreign language search engines: Yandex, Baidu, Naver
  • Implementing and correctly interlinking hreflang markup (see the sketch after this list)
  • Consulting on optimal international domain strategies
  • Marshalling international keyword research and site metrics to provide country- and continent-specific organic strategies
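
As a simple illustration of reciprocal hreflang markup, the sketch below generates the full, identical set of link tags that every language version of a page needs to carry, including a self-reference and an x-default; the locales and URLs are invented.

    // hreflang.ts - an illustrative sketch; the locales and URL patterns are invented.
    interface LocaleVersion {
      hreflang: string; // language or language-region code, or "x-default"
      href: string;     // absolute URL of that version of the page
    }

    // Every version must carry the same complete set of tags (including itself),
    // otherwise the annotations are not reciprocal and will be ignored.
    function hreflangTags(versions: LocaleVersion[]): string {
      return versions
        .map((v) => `<link rel="alternate" hreflang="${v.hreflang}" href="${v.href}" />`)
        .join("\n");
    }

    const versions: LocaleVersion[] = [
      { hreflang: "en-gb", href: "https://example.com/uk/pricing/" },
      { hreflang: "en-us", href: "https://example.com/us/pricing/" },
      { hreflang: "de", href: "https://example.com/de/preise/" },
      { hreflang: "x-default", href: "https://example.com/pricing/" },
    ];

    console.log(hreflangTags(versions));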

Tools

A horizontal bar chart showing the number of hours I have spent using the following SEO tools: Google Analytics (21900 hours), Google Search Console (21900 hours), Screaming Frog (21900 hours), AWR Cloud (20000 hours), SEMRush (1950 hours), Majestic SEO (18200 hours), Ahrefs (14705 hours), Moz (12650 hours).

Data and analytics

Having worked within the digital marketing industry, I understand the pivotal role that data attribution and modelling play in businesses' online success.


By establishing timely, accurate and comprehensive data capture I am able to:

  • Critically assess the value and performance of existing web properties and digital content
  • Produce detailed reports and analyses of both organic and paid marketing campaigns
  • Use sound statistical techniques to identify opportunities for further market outreach, engagement and conversions

As a developer, I possess the skills necessary to engineer clients' data architecture by integrating diverse data streams, setting up tag scripting and placement, and conducting attribution modelling.
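
On the tag-scripting side, much of that capture amounts to pushing well-structured events into a data layer that downstream tags can fire on. The sketch below shows the common Google Tag Manager pattern; the event name and fields are illustrative rather than a fixed schema.

    // datalayer.ts - a sketch of the standard Google Tag Manager dataLayer pattern.
    // The event name and fields below are illustrative, not a fixed schema.
    declare global {
      interface Window {
        dataLayer?: Record<string, unknown>[];
      }
    }

    export function trackEvent(event: string, detail: Record<string, unknown> = {}): void {
      // GTM reads events from this global array; create it if the container has not loaded yet.
      window.dataLayer = window.dataLayer ?? [];
      window.dataLayer.push({ event, ...detail });
    }

    // e.g. fired when a visitor submits an enquiry form
    trackEvent("enquiry_submitted", { formId: "contact-main", channel: "organic" });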

As an SEO analyst, I have ample experience of presenting data insights in a format that can be readily understood and acted upon.

Software

A horizontal bar chart showing the number of hours I have spent using the following tools: Google Analytics (16000 hours), Google Data Studio (14000 hours), Google Tag Manager (10000 hours), Adobe Omniture (8000 hours), Microsoft Power BI (5500 hours).

Experience

As an SEO analyst, I compiled weekly, monthly and quarterly organic media reports on behalf of a diverse roster of clients. I outlined performance in terms of standard digital metrics (clicks, impressions, sessions, bounce rate, etc.), client-specific conversion goals, and revenue. Findings were discussed and explored with clients on weekly calls or presented formally at quarterly business reviews.

As well as reporting on past performance, I outlined projections for the future. Using sampling and statistical methods, I forecast quarterly and yearly organic growth with a consistently high degree of accuracy, clearly demonstrating the value added by clients' partnership with the firms I represented.
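
As a simplified sketch of that kind of projection (not the exact models used, and with invented figures), a least-squares trend fitted to monthly organic sessions can be extrapolated a quarter ahead:

    // forecast.ts - a deliberately simple sketch: ordinary least squares on monthly
    // sessions, extrapolated three months ahead. Real forecasts also handle seasonality
    // and uncertainty; the session figures below are invented.
    function linearFit(y: number[]): { slope: number; intercept: number } {
      const n = y.length;
      const x = Array.from({ length: n }, (_, i) => i);
      const meanX = x.reduce((a, b) => a + b, 0) / n;
      const meanY = y.reduce((a, b) => a + b, 0) / n;
      let num = 0;
      let den = 0;
      for (let i = 0; i < n; i++) {
        num += (x[i] - meanX) * (y[i] - meanY);
        den += (x[i] - meanX) ** 2;
      }
      const slope = num / den;
      return { slope, intercept: meanY - slope * meanX };
    }

    const sessions = [8200, 8400, 8900, 9100, 9000, 9400, 9800, 9700, 10100, 10400, 10600, 11000];
    const { slope, intercept } = linearFit(sessions);

    // Project months 12-14 and sum them for a quarterly figure.
    const nextQuarter = [12, 13, 14].map((m) => Math.round(slope * m + intercept));
    console.log(`Projected next quarter: ${nextQuarter.join(", ")} (total ${nextQuarter.reduce((a, b) => a + b, 0)})`);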

Whilst at Oban International, I gained valuable experience managing paid media reporting on top of my SEO work. I managed the monthly reporting cycle for the BBC World Service, integrating paid media and paid social data streams from a diverse array of platforms (Adobe Omniture, Google Ads, Facebook Ads, Instagram, Twitter and VKontakte) covering multiple international markets.