
Embedding video in blogs

Another in an occasional series – collating my notes on blogging and infrastructure that I’ve sent to my customers.

This time discussing the fun of getting video embedded into your blog.

Background

The original conversation took place when I was self-hosting. Dropping a video into the blog was easy – the issues were really around performance (slow uplink) and the impact on other services (such as VoIP). Handing yourself a denial-of-service from your own infrastructure wasn’t a good plan: every time a video was viewed the uplink was hosed – and services were down.

The variables

Performance. What is the user experience when pulling down a video? What if the viewer is in Seattle, New York, New Zealand, London or Berlin? Is there any streaming, or does the whole video need to download before playback?

Privacy. Really about access control – and how to stop private family videos from being leeched or viewed by strangers.

Ease of use. Facebook is a great example of what people (consumers) expect: shoot video on the phone, link it to a blog/email, click, done. All of the magic (rendering, transcoding, codec munging, upload, CDN placement) happens behind the scenes.

Cost. Free is fantastic – but the elasticity of the above requirements changes that somewhat. For most of my blogging and video posting I’ve tended to rank:

  • Ease of use
  • Privacy
  • Cost
  • Performance

Somewhere in that mix is an immutable value that the cost cannot exceed ($5 per month? $10? $50 seems too much) – it should be ‘an affordable luxury’ as my friend Steffi put it.

The experiments

Self-hosting. Posting MPGs and serving them via Apache.

  • Ease of use – not great. Getting video to the blog was not a problem (ftp or xml-rpc) but embedding it was pretty crappy.
  • Privacy – good. Same authentication as the blog and photos.
  • Cost – fantastic. No additional expense.
  • Performance – very poor. Slow to serve, degraded service at home – all round a no go.

Amazon S3. Posting MPGs to S3.

  • Ease of use – ouch. Much harder than self-hosting – I couldn’t see myself explaining S3 buckets, uploads, security settings and blog embedding to customers (a sketch of what’s involved follows this list).
  • Privacy – medium. Security by obscurity.
  • Cost – good. Low cost to upload and store. Low cost to serve from S3.
  • Performance – better than self-hosting. European S3 buckets worked well for European visitors, and US-only content was pretty slow for EU viewers – but still better than self-hosting. Visitors outside the US and EU had a poor experience all round.
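
For what it’s worth, here’s roughly what that S3 workflow involves – a minimal sketch using the modern AWS CLI (which didn’t exist back then), with made-up bucket and file names, and relying on private objects plus time-limited signed URLs rather than plain obscurity:

  # Upload the video as a private object, then hand out a signed URL
  # that expires - this is about the only access control plain S3 offers.
  aws s3 cp holiday.mpg s3://my-video-bucket/holiday.mpg --acl private
  aws s3 presign s3://my-video-bucket/holiday.mpg --expires-in 86400   # valid for one day

Hardly ‘upload, click, done’ for a consumer – which is the ease-of-use problem in a nutshell.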

Shared hosting server. Posting MPGs to the shared blog host.

Very much a hybrid of the two options above, with some benefit to performance.

Video hosting service – YouTube

  • Ease of use – easy. Upload, click, done. Good plugins for embed to blog.
  • Privacy – poor. Private videos require a YouTube/Google login to view and can only be shared with 50 contacts. The alternative is unlisted – which is leechable: security by obscurity.
  • Cost – fantastic. Zero.
  • Performance – good. Fast and widely available. Good mobile support.

Video streaming/hosting service – Bits on the Run

  • Ease of use – easy. Upload, click, done. Good plugin for embed.
  • Privacy – medium. Security by obscurity – but less obvious than YouTube. Videos can also be protected by preventing download/leeching.
  • Cost – medium. Storage and egress costs.
  • Performance – good. Uses the CloudFront CDN (soon to change??), which has global distribution – works well across the US, Europe and Asia-Pacific.

After hacking around with all of the options I decided to use Bits on the Run for hosting and streaming videos. I’ve been really happy to date. It’s costing me around $10 per month.

The conversation

Here’s the first mail – with some slight changes – but the main tradeoffs are in there.

So what to look for:

– Performance

– Storage and bandwidth costs

It’s also a much better experience for your various viewers. The video content is pushed out to a global content delivery network – see the map below – so no matter whether you’re in Munich or Sydney, you’ll get smooth, fast video.

It’s all about your tradeoffs:

– Privacy (none with YouTube)

– Speed (can you watch it without a painful wait)

– Technology (do you need a PhD to get the video edited, uploaded, converted to the right format and put into the blog?)

– Cost

It’s the usual elastic maths – I ended up using a video streaming service because it really makes the speed and technology easy (upload and it’s done) and it integrates with the blog (easy) – but it comes at a price.

Then the follow-up.

The challenges of getting video securely into your blog were numerous:
– performance (slow, stop/start video, poor playback)
– technically hard
– getting the video into a format that was useful to play on the web
I’ve been testing out a video-on-demand service that seems to fix these issues. All you need to do is have your video ready in AVI or MP4 format, upload it – then embed it into your blog. It should be that simple. All of the technical gubbins behind the scenes is looked after – along with getting the video to the right place on the web.
Bits on the Run is a pay service that does all of this – and it’s reasonably priced. Notice that it’s not FREE. Depending on how many people download your videos, you might have some surprises. On the positive side it does let you control the security quite carefully (i.e. it’s not on YouTube with a kabazillion people able to watch your family) and it moves the videos to a point on the web close to your viewers. Tie that into the web-site/blog security that’s already in place – and it’s pretty good all round.
Here are the points around the globe where videos are stored. This means that people in the US, Europe, Japan and Aus/NZ can view the videos with good performance.

Killing boxes

I got rid of five more servers from the rack this weekend.

Web server – this was a nice fat server for virtualisation; now repurposed with a nice video card as a media server for the home theatre.

Old media server is now dead. So long!

The old MP3/iTunes/TVersity server is now consolidated with the new media server – the last Windows 2003 box is gone! So long!

I also managed to decommission the old management server – it originally ran ZCM 10 (natch), then ConfigManager, then SCE. That’s all managed from the cloud now – so that box is dead, along with the SQL Server box that sat next to it. So long!

I’ll take a pic of the empty rack. It’s got two UPSes, the archive server and the firewall – that’s it. Looks pretty lonely.

The end of an era – no more self-host

After a decade of static IP and reasonable bandwidth I finally ended my ‘experiment’ and ‘learning lab’ which was self-hosting.

Before we moved to the US I ran NetWare 5.1 from my WW2-era bomb shelter under my house in Nottingham. This was the home for my email (running NIMS – subsequently NetMail) and a semi-static list of resources, links and thoughts.

After the move to the US the server went through NetWare 6.5, then Red Hat Linux 7, then SLES 8. SLES 8 served as an admirable platform for photos and a hand-written blog when kid #1 appeared in mid-2003. The hand-written weblog soon evolved into Blogger.

In early 2005 the number of photos and blog updates grew too large, and a combined update to SLES 9 and WordPress was called for. NetMail moved from Red Hat to Windows Server and then on to SLES 10 (NetMail was showing little innovation by then, and was sold off shortly after), before mail finally went up to Google Apps.

The final incarnation of the blog server was running SLES 11 SP1 on top of Hyper-V/Windows Server 2008 R2 – still running WordPress and all of the various add-ons.

The server is now offline; the VM backed up – it’s going to be rebuilt as a media server.

The blogs and photos are all now hosted on GoDaddy – and mail is on hosted Exchange.

It’s been a great learning lab – firewalls, hosting, Apache, MySQL; it’s given me some great experience with change control and planning – and sometimes things just went wrong.

Infrastructure changes – Covad and GoDaddy frustrations

I’m most of the way through the infrastructure changes at the moment.

A recap:

  • Migration of mail from Google Apps to Hosted Exchange.
  • Migration of DNS from current service provider to ‘someone new’
  • Migration of blog/photos to ‘somewhere in the cloud’

Step one – the mail switch – was relatively painless. It needed some careful planning, but there was zero downtime.

  • sign up for Hosted Exchange (actually the Microsoft Exchange Labs Friends and Family program)
  • DNS changes (mainly CNAME work to prove ownership)
  • family education (the hard part)
  • DNS changes (MX records and webmail A and CNAME records – see the dig checks after this list)
  • reconfiguration of email clients (Outlook, phones, devices etc)
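
For the DNS steps, a few dig queries made a handy sanity check that each change had propagated before moving on – a minimal sketch, assuming the standard BIND tools and with example.com standing in for the real domain:

  # Check the records have propagated before flipping anything else over:
  dig +short MX example.com               # new Exchange MX records visible?
  dig +short CNAME webmail.example.com    # webmail alias in place?
  dig +short A example.com                # main A record still intact?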

As I wrote a couple of months ago – the old mail lives on at Google Apps. Everything new is in Exchange.

Step two of the move was more complex and painful. I decided to change the DNS hosting, consolidating the various registrars I’d used over the past decade. What should have been a week-long process of sign-up, DNS unlock, auth code request and move ended up eating most of my time.

I moved from register.com and Network Solutions (resold by Covad). Getting the domains unlocked and obtaining auth codes for the move was a snap with register.com – they were efficient, friendly and knowledgeable – and it took about five days. Covad was a nightmare: total time five weeks and multiple escalations. During that time Covad managed to completely screw up the zones too.

Step three is mostly complete too. Only one blog site is left to move – and the photos are uploading right now. This is where my real frustration with GoDaddy comes in. They have pretty good (i.e. I get what I pay for) hosting and infrastructure – but some of the grid-hosting limitations, and the associated responses from support, are really frustrating.

The GoDaddy issue is that they either cycle the grid hosts (so an ssh/scp session is terminated) or they kill long-running processes. With four photo blogs, an insane number of photos and some 80GB of data to move, I had to get creative.

Firstly, copying the data via non-secure FTP wasn’t really my idea of fun. I started off with scp – but the remote host kept killing the connection. Next I tarred up the needed files – and the connection was killed again. The final working solution to get the pictures up to GoDaddy was the convoluted tar – md5sum – split – scp – cat – md5sum – untar chain, roughly as sketched below. Moving 80GB in 200MB chunks with a retry script at my end was not fun.
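
A minimal sketch of that pipeline – file and host names are made up, and this is the shape of it rather than the exact script I ran:

  # Sending side: bundle, checksum, chop into 200MB chunks.
  tar cf photos.tar photoblog/
  md5sum photos.tar > photos.tar.md5
  split -b 200M photos.tar photos.tar.          # produces photos.tar.aa, .ab, ...

  # Push the checksum, then each chunk - retrying until scp survives
  # the remote side killing the session.
  ssh user@gridhost "mkdir -p chunks"
  scp photos.tar.md5 user@gridhost:~/
  for chunk in photos.tar.??; do
    until scp "$chunk" user@gridhost:~/chunks/; do
      sleep 5                                   # crude retry on a dropped session
    done
  done

  # Receiving side (run on the GoDaddy host): reassemble, verify, unpack.
  cat chunks/photos.tar.?? > photos.tar
  md5sum -c photos.tar.md5
  tar xf photos.tar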

The next issue was actually untarring these enormous tarballs. The first site unpacked just fine; the second kept being interrupted – i.e. tar was getting killed. There is no ‘nice’ on the server – so no way to fly below the radar. It turns out there is a process time limit of something like 180 seconds, which puts the practical limit for an untar at around 13GB. My frustration with GoDaddy support was that they kept telling me to use FTP and that there was a 100MB limit for tar. I spoke to GoDaddy support right at the start of this process and offered to PAY to ship a USB drive with the 80GB of tarballs for an admin to dump onto my space. I’d say there’s a value-add opportunity here for GoDaddy.
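
With hindsight, one way to stay under the limit is to extract a big tarball in slices, one top-level directory per tar run, so no single invocation lives long enough to be killed – a sketch, assuming GNU tar and that the listing pass itself squeaks in under the 180 seconds:

  # Extract one top-level directory at a time to keep each tar short-lived.
  for dir in $(tar tf photos.tar | cut -d/ -f1 | sort -u); do
    tar xf photos.tar "$dir"
  done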

Lessons learned:

Change control and planning are king. See previous posts. Nothing went wrong – but there were things that could have been smoother. What I guessed would be a few weeks turned into a two-month project.

Test with real-world datasets. Migrating a test blog with 200 photos isn’t a valid test.

First-line support people often just repost from the knowledgebase. A 100MB limit for tar is unrealistic – tell people it’s a time-based kill rather than a size limit. We can figure it out and work around it.

Uploading MySQL dumps to GoDaddy

Strictly a console guy, I’ve been struggling to get the big blog database dumps up to the new hosting. phpMyAdmin claims to support zipped dumps – but that didn’t work for me. There are also console timeouts on the upload and import.

I finally fixed it by using scp to move the uncompressed dump to the hosting server, and then using the Hosting Control Center to restore the dump as if it were a backup.
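
In command form the working route was roughly this – a sketch, with invented database and host names:

  # Dump uncompressed at the source and copy it up with scp; the restore
  # then happens in the Hosting Control Center UI, which sidesteps the
  # phpMyAdmin upload and import timeouts.
  mysqldump -u bloguser -p blogdb > blogdb.sql
  scp blogdb.sql user@gridhost:~/db_backups/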

It’s running right now – so hopefully I’ll have happy blogs again soon.

Infrastructure – heavy lifting and planning

A trio of projects before the year-end – all intertwined.

  • Migration of mail from Google Apps to Hosted Exchange.
  • Migration of DNS from current service provider to ‘someone new’
  • Migration of blog/photos to ‘somewhere in the cloud’

Moving the mail isn’t that hard – it’s just making sure that mail doesn’t get dropped while the new MX and CNAME records are propagating. The old mail will live on in Google Apps – the new stuff in hosted Exchange. The trickier part is making sure that ‘my customers’ get the right service – and can keep getting mail in Outlook or on the web. Users, eh.

Moving the DNS is part of the mid-term strategy to change ISP. Covad have been great to me since I moved to the US; sadly they are starting to show signs of decay. I need to support more DNS record types than just A, CNAME and MX – and Covad have no plans for that.

The final push is to move the blog servers out of the ‘home data centre’ and to a reliable, faster provider.

The ultimate aim is to divorce myself from Covad and the Static IP business DSL that has worked so well – and move to something that is much faster – but maybe without the SLA on the line itself.

xCache – PHP caching, performance and stability

I’ve been testing out xCache for a while – primarily as a PHP accelerator.

Early results were really promising – reducing page load times dramatically, and also reducing CPU load as common pages (i.e. the latest blog post and photos) were served directly from the cache.

There seems to be some kind of memory leak/cache clean-up issue with xCache 1.3. I allocate some amount of RAM for the cache (16MB, 64MB, 256MB – it really doesn’t matter) and at some point Apache/PHP starts eating up RAM, then swapping – and finally the server grinds to a halt.
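
For reference, these are the xcache.ini knobs I’ve been turning – the values are just examples of what I tried, not a recommendation, and none of them cured the problem:

  ; opcode cache size - 16M, 64M and 256M all hit the same wall
  xcache.size        = 64M
  xcache.count       = 2        ; one cache per CPU core
  xcache.ttl         = 3600     ; let entries expire instead of living forever
  xcache.gc_interval = 300      ; run garbage collection every five minutes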

xCache is off for now – I’ll keep investigating.