
Amrisha Prashar
on 4 February 2016

#UbuntuAtMWC Competition


We’re really excited about MWC this year and can’t wait to showcase the various demos across Cloud and Devices!

For those of you not going, we’d love to offer you the chance to come and join us and see what we have on display – our presence is even bigger and better than last year.

To win tickets to MWC, tell us or show us in a tweet why you want to see Ubuntu at MWC! Feel free to be as creative as possible with text, images, GIFs or even videos, and add the hashtag #UbuntuAtMWC so we can track your awesome entries. The competition starts on 4th Feb (17:00 GMT) and finishes just before the stroke of midnight on Valentine’s Day – 14th Feb (23:55 GMT). Our creative team will judge the entries based on originality and select three winners, whom we’ll contact via Twitter.

Happy creating and we can’t wait for your entries!

Terms and Conditions #UbuntuAtMWC
