Author Archives: aibrahim

How I used Uber for almost 2 years for free

Between 2015 and 2017 I referred hundreds of people to Uber and earned 42,110 LE worth of referral credit, an amount that covered hundreds of my trips during that period.

I remember it was April 2015 when I was sitting with 2 of my best friends in a restaurant, talking about Uber and how much it had helped the transportation sector in Egypt.

At that time, I had only used Uber a few times. Then we started talking about their referral system, which paid 70 LE per referral, enough back then for a complete trip from New Cairo or October to downtown.

One of my friends mentioned a colleague at work who had made a few thousand pounds of credit and now rode Uber for free every day. I asked him why he didn’t do the same; it didn’t look hard to do, and he used Uber a lot. He replied, “It is not me”.

Going home that day, I couldn’t get the idea out of my mind. My car had been stolen a few months earlier and moving around Cairo was hectic for me, so I decided to act. I formed a simple plan, and all I needed was:

1- A sexy referral code
2- A catchy picture of Uber being used in Egypt
3- An Arabic description of the service covering all the concerns raised at the time, such as safety and payment

I went to my Uber account and customized my code to “GetUberRide4Free”, got a picture of an Egyptian celebrity getting into an Uber, and found a good copy of the service description online.

Armed with the perfect post, it was time to pick the mediums to post it in. I picked 2:

1- Whatsapp family group.
2- Facebook.

WhatsApp did great for me. People in those groups wouldn’t bother to create their own referral codes; they would just forward my message to their other groups as-is. I was also answering all their concerns and questions about the service, and at family gatherings I would show them how to use it. I was like one of Uber’s first-line customer support 🙂

Facebook was another story. The post was gaining momentum with lots of shares, and with my continuous responses to people’s questions, some Facebook pages with hundreds of followers copied it as-is without changing the referral code. The awesome thing is they started answering members’ questions about the service and convincing more people to join, and luckily it was my referral code.

See how many comments and shares this post had? It went purely viral.

For months after that, my mobile would flash every few minutes with an email like this

My Uber credit exceeded my bank account balance sometimes too 😉

I made 42,110 LE worth of Uber credit. My phone was like Aladdin’s lamp: I could move from one place to another.

This was my experience with the Uber referral program. It was definitely worth trying.

AWS Summit London 2019

Yesterday I had the chance to attend the AWS Summit London with some of my colleagues at Zava.

We have become more of an AWS shop recently, after moving the majority of our infrastructure to the cloud.

Besides attending the sessions, touring the booths, and enjoying the freebies AWS gives to certified engineers, I also went for the ML workshop, as I did back in December at AWS Builders Day 2018.

The first session was titled “IoT and Alexa in connected homes”.
The interesting part for me was how you can use Alexa to interact with your users and extract specific information for use in your workflow, like order placement or online help.
They also showcased a study of how Lancaster University implemented a voice-enabled service for its students in 120 days, with awesome features.
Some slides documenting their journey can be found here

The second session was titled “Build data driven high performance internet-scale applications with AWS databases”.
It covered database architecture strategies and best practices for building high-performance, internet-scale applications using Amazon DynamoDB, Amazon Timestream, and Amazon ElastiCache.
They also showcased how the Guardian moved away from MongoDB, after more than one production outage, to RDS PostgreSQL.

One of the slides that helps with your DB architecture decisions

The third session I attended was “Modern application architecture”.
That one was purely a marketing session showing more AWS tools and motivations to use them in your architecture. They also showcased how they implemented many of these tools with Sage, specifically the CI/CD process.

After the sessions, I attended the “Using AI/ML to Personalize your Recommendations” workshop for about 2 hours, which wasn’t enough to train all the models and create all the solution campaigns.
I got to use AWS Personalize, which is still in preview, and I could see how much manual work it saved compared to when I did similar examples back in December.
You can find the full workshop guide here
But you can’t implement it until the service becomes generally available.
At night I checked the status of the training/campaign creation and was happy that it had worked without unexpected issues.
You can see a demo of the app we built here

All the AWS summit 2019 slides are available at this link

AWS Solutions Architect Associate exam experience and study tips

So I got my AWS Solutions Architect Associate certificate last week, with a score of 939 out of 1000.

Over the past 2 years, I’ve been dealing with AWS from time to time, but that has increased significantly in the past 5 months, when I started doing a lot of work related to S3, Lambda, and Step Functions.

In December I decided to take the architect exam to sharpen my AWS/cloud skills. I prepared for it for less than a month and took the exam in mid-January.

I prepared for the exam from multiple resources, but I must say the exam itself is harder than all the practice tests I’ve seen; there were some topics that were never mentioned in the materials below, and you just pick them up from experience.

So first I started with a monthly subscription to the course: AWS Certified Solutions Architect Associate

One month’s subscription was enough to prepare for the exam. This is a great course that goes through almost all the topics in the exam. The labs are very important to do by hand, especially the VPC one; if you can’t do it from memory, don’t sit the exam.

Then I purchased the practice tests from Whizlabs.

The answer explanations in Whizlabs will help you a lot in understanding the best solution for each scenario. Most of the time there is more than one correct solution, but one of them is the right answer because of a specific requirement in the question, like cost, availability, or disaster recovery.

I also went through Jayendra Patil’s blog and read many of the pages in the AWS Solutions Architect Associate exam learning path.

Lastly, I purchased the practice exam from AWS; I had seen most of its questions in the acloud guru exam simulator and the Whizlabs tests.

In summary, I think obtaining the certificate is a good addition for engineers who interact with AWS services in their day-to-day work.

AWS Builders Day 2018

AWS Builders Day, December 2018, London


I’ve attended a few AWS events over the last couple of years in different countries, yet the best experience I’ve had was at AWS Builders Day this December 2018 in London.

The difference this time is that I aimed for the workshops, not the sessions. I went for the AI/ML workshops, and it was a good deviation from the type of technologies I use day to day.

The reason I like this experience is that it was a sort of hands-on training. I guess if I had gotten that much information in a session, it wouldn’t have stuck in my mind the way it did when I applied all the examples by hand.

You can find the materials for these workshops in this GitHub repo; unlike many tutorials, these work as expected if you follow all the steps.

Going through the examples, you will cover different AWS tools such as Amazon Rekognition, Amazon Lex, EC2, S3, Amazon Machine Learning, SageMaker, the Deep Learning AMI, Apache MXNet, and Jupyter Notebook.

I hope you will find the material as enjoyable as I did, and if you have any similar resources, please share them in the comments.

There is a better way to implement X

One of the code review comments that I’ve been seeing recently is “There is a better way to do this”, without further explanation.

By far this is one of the killer comments you can leave someone. It not only tells the coder that their code is not the best, but also leaves them with two challenging questions.

  • What is wrong with this implementation?
  • What are those better ways?

My advice to any reviewer who writes such a comment: try to write a more constructive review. One of the goals of code review is to identify problems in the code and propose better implementations. So if you see something that is not at its best, identify why it is not, and what the better ways to implement it are. Always give the requestor something to learn and go home with.

And if you requested a review and received this comment, don’t be defensive about your code. Just take a deep breath and ask the 2 questions above.

The blog is Back

After almost four and a half years since my last post, and a long period of the blog being offline, I managed to revive it from the ashes and bring it back up.

More to follow…

Adding custom mapping types in Doctrine

While working on some enhancements for my startup Careerise, I had an issue with Doctrine not recognising the MySQL data type “blob”.

The error I was getting while running doctrine diff was:

  Unknown database type blob requested, Doctrine\DBAL\Platforms\MySqlPlatform may not support it.

In order to overcome the issue, I had to add the blob data type to doctrine.

Step 1:

Create a class to handle the blob type.

Path: Doctrine/DBAL/Types/BlobType.php

File content:

namespace Doctrine\DBAL\Types;

use Doctrine\DBAL\Platforms\AbstractPlatform;

/**
 * Type that maps a database BLOB to a base64-encoded value
 * @author Ahmed
 */
class BlobType extends Type
{
    public function getName()
    {
        return Type::BLOB;
    }

    public function getSQLDeclaration(array $fieldDeclaration,
            AbstractPlatform $platform)
    {
        return $platform->getDoctrineTypeMapping('BLOB');
    }

    public function convertToDatabaseValue($value, AbstractPlatform $platform)
    {
        return ($value === null) ? null : base64_encode($value);
    }

    public function convertToPHPValue($value, AbstractPlatform $platform)
    {
        return ($value === null) ? null : base64_decode($value);
    }
}

Step 2: Add the blob type name to the Type abstract class

File path: Doctrine/DBAL/Types/Type.php

    const BLOB = 'blob';

Step 3: Modify the DoctrineTypeMappings for MySQL

File path: Doctrine/DBAL/Platforms/MySqlPlatform.php

Append the blob type to the end of the array, e.g.:

protected function initializeDoctrineTypeMappings()
{
    $this->doctrineTypeMapping = array(
        'tinyint'       => 'boolean',
        'smallint'      => 'smallint',
        'mediumint'     => 'integer',
        'int'           => 'integer',
        'integer'       => 'integer',
        'bigint'        => 'bigint',
        'tinytext'      => 'text',
        'mediumtext'    => 'text',
        'longtext'      => 'text',
        'text'          => 'text',
        'varchar'       => 'string',
        'string'        => 'string',
        'char'          => 'string',
        'date'          => 'date',
        'datetime'      => 'datetime',
        'timestamp'     => 'datetime',
        'time'          => 'time',
        'float'         => 'float',
        'double'        => 'float',
        'real'          => 'float',
        'decimal'       => 'decimal',
        'numeric'       => 'decimal',
        'year'          => 'date',
        'blob'          => 'blob',
    );
}

Now Doctrine will be able to recognise the MySQL data type blob.

The documentation here was helpful for some of the steps above: Custom Mapping Types
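For reference, the Custom Mapping Types documentation also describes registering a custom type at bootstrap time instead of editing the library files. A minimal sketch (assuming `$conn` is an existing Doctrine\DBAL\Connection and the BlobType class above is autoloadable) could look like:

```php
use Doctrine\DBAL\Types\Type;

// Register the custom type under the name 'blob', if it is not already known
if (!Type::hasType('blob')) {
    Type::addType('blob', 'Doctrine\DBAL\Types\BlobType');
}

// Map the database-level 'blob' column type to the Doctrine 'blob' type,
// so that schema introspection (e.g. doctrine diff) recognises it
$conn->getDatabasePlatform()->registerDoctrineTypeMapping('blob', 'blob');
```

Either way should stop the “Unknown database type blob requested” error; the registration route just has the advantage of surviving library upgrades.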

How to reduce your VirtualBox Linux virtual machine size

After using a Linux virtual machine configured with a dynamically expanding drive, the disk file keeps growing even if you remove files from the guest.

Here are the steps to shrink your Linux virtual machine size:

1- Remove any unnecessary files you don’t need from the virtual machine.

2- Fill the unused spaces on it with zeros.

  • Open the virtual machine.
  • Install zerofree:
sudo apt-get install zerofree
  • Reboot the machine in repair mode, and log in to a root shell.
  • Kill any processes using the hard disk:
service rsyslog stop
service network-manager stop
killall dhclient
  • List your virtual machine’s hard disks.
  • Remount each filesystem read-only, then fill its unused space with zeros:
mount -n -o remount,ro -t ext3 /dev/sda1 /
zerofree /dev/sda1
  • Shut down the machine.

3- Clone the old HD file into a new, shrunk one

VBoxManage clonehd oldHd.vdi newHd.vdi

4- Now add the new cloned HD file as the new HD for the machine and test it.

5- Remove your old HD file.


Second Life and virtual worlds on the web

I had the chance to do some investigation into how to create a Second Life-like application that runs on the web and allows people to communicate, chat, and interact in different ways.

Applications of such tools include virtual meeting rooms and small virtual worlds for people to share or discuss a cause or idea. There is also the fun part, where people can interact as in a game, purchase items, go to virtual malls, etc.

Here are some links to the wikis of tools that can create virtual worlds:

I hope these are useful for anyone interested in virtual world applications. If you have any URLs to add, please let me know.

Using Juniper Network on Ubuntu 32- and 64-bit without Firefox

Lately I have moved from Ubuntu 32-bit to 64-bit to make use of the 8 GB of memory, and since then I have been struggling to get our Juniper VPN connection working; it turned out it is not compatible with the 64-bit binaries of Java and Firefox.

I found this great tutorial here for getting the Juniper Network VPN working from the command line.

The only disadvantage of this is that you won’t have a status window to monitor the connection activity.

This has been tested on Ubuntu 10.10 Maverick.