Martin Tøttrup Lancaster
This page includes a more detailed look at my CV, as well as more description of my personal projects,
including how you are viewing this website. Granted, this is a basic HTML page at the moment (which can always be improved),
but this webpage is just to showcase my experience, skills and enthusiasm, focusing mainly on Data Engineering and Cloud Engineering skills.
Hopefully this gives you more of an understanding of who I am and what I can bring to the table!
Work Experience
Employer |
Role |
Time and Location |
Responsibilities Included |
Aer Lingus |
Data Engineer Consultant |
May 2023 - May 2025 : Dublin, Ireland (Remote) |
Helped Aer Lingus transition to a new customer-focused Customer360 (C360) programme, used to provide insights into customer trends and to centralise customer data. The tasks and skills that helped boost revenue by 3% within the first six months of deploying the C360 programme are shown below.
- Created ETL pipelines for Booking data (Passenger Name Record - PNR, Passenger Detail Reference - PDR), Financial data (Avios, Chase and Payment) and Flight data (Flight Information and Control Systems - FICS), combining flight data with PNR data so that bookings displayed accurate dates and times
- All data was initially stored in S3, moving through multiple S3 buckets for data retention and security.
Data was then ingested into Snowflake via Snowpipes or SQS queues (deployed via CloudFormation).
DML was then executed using tasks containing stored procedures, or via AWS MWAA to run ordered SQL statements.
These statements moved the data from our Preliminary Staging Area (PSA)
to our Operational Data Store (ODS), built on an Inmon model, and then on to populate tables in the Kimball-modelled Data Warehouse (DWH).
- MWAA DAGs were used to export data from Snowflake to Salesforce via AWS S3; this included task logging to CloudWatch log groups and streams. Exported data was logged to provide full traceability and a robust way to re-extract from a specific point in time if systems failed upstream. These exports drove customer emails such as ‘Before you fly’, Check-In, Loyalty emails and customer vouchers.
- Created Aer Lingus’ first Customer KPI Dashboard using Tableau to provide valuable business insights on customer behaviour and trends over time.
- Provided an automated API response via Lambda for customer service agents querying the booking database, giving agents information about customers calling in about their booking. OAuth2 was used as part of a Lambda Authoriser, which validated the bearer token received in the request header.
- Liaised with architects and stakeholders to help create models/schemas, and presented how the data would look so that stakeholder requirements were met.
- Implemented data quality checks using Monte Carlo to help monitor pipeline activity and validate the data as it moved through our pipelines.
- Set up CloudWatch metrics and alarms integrated with Opsgenie for centralised end-to-end pipeline monitoring, so issues could be identified and resolved 24/7 using documentation, with minimal downtime.
- Deployed code from a GitHub repository via GitHub Actions and CloudFormation. Actions deployed to specific environments in Airflow and Snowflake depending on which folders the files were added to.
Actions were also used to deploy other AWS services such as Lambdas, SQS queues, SNS Topics and Subscriptions, Glue, S3 Event Notifications and EventBridge rules, CloudWatch Log Streams, Log Groups, Metrics and Alarms.
- Helped begin testing on Databricks to compare performance with the existing Snowflake pipelines, in preparation for a migration to Databricks planned for the following year.
|
Kubrick Group Ltd |
Data Engineer Trainee/ Consultant |
September 2022 - May 2025 : London, United Kingdom (Hybrid) |
- Trained to specialise in Data Engineering, including cloud-based technologies, which involved learning and developing advanced skills in data modelling, SQL, Python, etc.
- Used and implemented SQL, Spark (via Databricks), Azure Data Factory, Docker containers and other tools and services.
- Applied these skills on a project for AstraZeneca using dbt and Snowflake, working as part of an agile team to complete sprints and present deliverables to stakeholders.
|
DEFRA (UK Government Department) |
Admin Support Officer |
March 2021 - June 2022 : London, UK (Remote) |
- Used an Agile framework to help create certificates, based on UK legislation, for various animals and animal products from the EU and international countries, so that they could be imported to or exported from the UK
- Designed webpages for the GOV.UK website to allow certificates to be used by consumers and exporters
- Worked to strict deadlines driven by Brexit timeframes, in a high-pressure environment, to ensure certificates could be used immediately by importers and exporters to the UK
|
The Cooperative Food, MidCounties Coop |
Customer Service Assistant |
September 2016 - July 2022 : Hazlemere, High Wycombe, UK |
- Managed the shop and took responsibility for other employees during closed hours, demonstrating team leadership as well as the time management needed to complete work before night staff arrived.
- Served customers at the tills and helped them around the shop, improving customer relations as well as communication and marketing skills.
- Took in deliveries, carried out stock takes and organised stock so it could be put out onto the shop floor, highlighting organisation and working in a pressured environment to make sure stock was on display by a set time.
- Faced up shelves so that products were appealing to customers, paying close attention to detail so that every product was organised and in the correct location.
|
Just Right Fundraising |
Door to Door Charity Worker Representing RSPCA |
September 2016 - July 2022 : Nottingham, UK |
|
Leaf Catering |
Bar Staff |
January 2017 - December 2017 : Hatfield, UK |
- Bar work, coordinating with peers to prepare drinks ready for customers arriving, demonstrating organisation skills, customer service skills and the ability to work quickly in a pressured environment.
|
R.Bensons |
Work Experience |
April 2015 (2 Weeks) : Chesham, UK |
- Responsibilities included health & safety checks and tests, reception duties, filing, general involvement in everyday office work, and visiting customers.
- Attributes gained include first-hand experience of working in high-pressure situations and learning which skills such an environment requires.
|
Projects
Websites
CV Website - What you are currently looking at
A static website providing an extended version of my CV, with more detail on my
previous experience and on the personal projects I have undertaken.
AWS Services Used: Route53, CloudFront, ACM, S3, CloudFormation
How It All Works Together:
- Upload Website to S3
- Your static website (e.g., index.html, style.css) is stored in an S3 bucket.
The bucket is not publicly accessible; instead, access is via CloudFront.
- Distribute via CloudFront
- CloudFront pulls content from the S3 bucket using Origin Access Control (OAC).
It caches the content at edge locations for fast global delivery.
You configure the default root object (e.g., index.html), and optionally add custom domain aliases.
- Secure with HTTPS via ACM
- You request an SSL certificate from ACM (must be in us-east-1 for CloudFront).
The certificate is attached to the CloudFront distribution for HTTPS support on your domain.
- Route Custom Domain with Route 53
- You configure Route 53 with an alias A record pointing your domain (e.g., www.example.com) to the CloudFront distribution.
This lets users access your website via a friendly domain name, not the CloudFront URL.
Example Flow
- User types www.example.com in browser.
- Route 53 resolves it to the CloudFront distribution.
- CloudFront serves the content. If cached, it's served instantly. If not cached, CloudFront fetches it from S3 (securely, using OAC).
ACM ensures the connection is encrypted via HTTPS.
- The browser displays the static site!
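As an illustration of the upload-and-refresh step above, here is a minimal Python (boto3) sketch of how the site files could be pushed to S3 and the CloudFront cache invalidated. The bucket name and distribution ID are placeholders; in practice they would come from the CloudFormation stack outputs.

```python
import mimetypes
import time
from pathlib import Path

import boto3

# Placeholder names for illustration only - the real bucket name and
# distribution ID come from the CloudFormation stack outputs.
BUCKET = "my-cv-website-bucket"
DISTRIBUTION_ID = "E1234567890ABC"

s3 = boto3.client("s3")
cloudfront = boto3.client("cloudfront")

def deploy_site(site_dir: str = "site") -> None:
    """Upload every file in site_dir to S3, then invalidate the CloudFront cache."""
    for path in Path(site_dir).rglob("*"):
        if path.is_file():
            key = path.relative_to(site_dir).as_posix()
            content_type, _ = mimetypes.guess_type(path.name)
            s3.upload_file(
                str(path),
                BUCKET,
                key,
                ExtraArgs={"ContentType": content_type or "binary/octet-stream"},
            )

    # Invalidate cached objects so the edge locations pick up the new version.
    cloudfront.create_invalidation(
        DistributionId=DISTRIBUTION_ID,
        InvalidationBatch={
            "Paths": {"Quantity": 1, "Items": ["/*"]},
            "CallerReference": str(time.time()),  # must be unique per invalidation
        },
    )

if __name__ == "__main__":
    deploy_site()
```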
The image below displays the request flow from the user's browser, through Route 53 and CloudFront, to the S3 origin:
Online Shopping Website
A replication of an online shop where a user can purchase an item and receive a confirmation email,
while the data is inserted into a back-end database where it can then be used for stock, order completion, customer trends, customer support, etc.
This project showcases cloud skills by replicating an online shopping order page which, once completed and submitted, loads the data
into a back-end database (DynamoDB) that can then be used for stock and order completion, further analysis of customer trends and/or customer support. The user is also
emailed to confirm their order, along with their order number, so they know the order was successful and that their
purchase went through.
Code was deployed to AWS using GitHub Actions, enabling automated deployment of AWS services, objects and roles, making deployments easier and utilising the full functionality of IaC.
AWS Services Used: API Gateway, Lambda, DynamoDB, SES, S3, CloudFormation + services used above for the CV Website
How It All Works Together:
- HTML Page (Front End)
- The user visits your online shop in their browser
- They fill out an order form (e.g., select item, quantity, name, etc.).
- When they click “Submit Order,” the form triggers JavaScript code that sends the order data to your backend using an HTTP request (typically POST).
- API Gateway (Backend Entry Point)
- The order request is sent to Amazon API Gateway
- API Gateway acts as a secure entry point for your backend.
- It accepts the HTTP request and forwards the order data to your Lambda function.
- AWS Lambda
- The Lambda function receives the order data
- It runs your custom code to validate, process, and prepare the data for storage
- It then connects to your DynamoDB table
- It also connects to SES to send an email to the customer, notifying them of their successful order
- DynamoDB
- Lambda inserts the order into your DynamoDB table
- DynamoDB stores the order details (e.g., order ID, customer name, product, quantity, timestamp)
- You can later retrieve or display these orders from the database for further analysis, process refunds etc.
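To make the Lambda step concrete, below is a minimal sketch of what such a handler could look like in Python with boto3. This is an illustration rather than the exact implementation: the table name, sender address and form fields are placeholder assumptions.

```python
import json
import os
import uuid

import boto3

# Hypothetical resource names for illustration - the real table name and
# sender address would be supplied via CloudFormation as environment variables.
TABLE_NAME = os.environ.get("ORDERS_TABLE", "Orders")
SENDER = os.environ.get("SENDER_EMAIL", "orders@example.com")

dynamodb = boto3.resource("dynamodb")
ses = boto3.client("ses")

def lambda_handler(event, context):
    """Receive an order from API Gateway, store it in DynamoDB and email a confirmation."""
    body = json.loads(event.get("body") or "{}")

    # Basic validation of the submitted form fields (illustrative field names).
    required = ("name", "email", "product", "quantity")
    missing = [field for field in required if not body.get(field)]
    if missing:
        return {"statusCode": 400, "body": json.dumps({"error": f"Missing: {missing}"})}

    order_id = str(uuid.uuid4())
    item = {
        "order_id": order_id,
        "customer_name": body["name"],
        "email": body["email"],
        "product": body["product"],
        "quantity": int(body["quantity"]),
    }

    # Persist the order for stock, fulfilment and later analysis.
    dynamodb.Table(TABLE_NAME).put_item(Item=item)

    # Confirmation email containing the order number.
    ses.send_email(
        Source=SENDER,
        Destination={"ToAddresses": [body["email"]]},
        Message={
            "Subject": {"Data": f"Order confirmation {order_id}"},
            "Body": {"Text": {"Data": f"Thanks {body['name']}, your order {order_id} was received."}},
        },
    )

    return {"statusCode": 200, "body": json.dumps({"order_id": order_id})}
```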
The image below displays the process used to move data from user submission to the back-end database and email send:
To visit the website for the Online Shop, click here
ETF Index Fund Dashboard
An ETL pipeline used to showcase trends in ETF index funds via a dashboard, whilst also providing the ability to
dive in and analyse individual stocks that are part of the selected index.
Services and Tools Used: Python, Streamlit, Snowflake, dbt, CloudFormation
This project showcases skills gained from my experience at Aer Lingus, whilst also using dbt to undertake some of the calculations
to demonstrate its usefulness. Essentially, this project involves obtaining financial data using Python (via the Yahoo Finance module).
Data is then organised into a dataframe and specific columns are selected.
These dataframes are then used to build an INSERT statement that loads the data into Snowflake. In this project a demo account
for Snowflake was used, which limits the automation that can be achieved. Ideally, this Python file would run on something like MWAA,
which would allow the latest data to be inserted/updated in Snowflake on a schedule, increasing the dataset available for analysis
in the dashboard by business users downstream.
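As a rough sketch of the load step described above, the snippet below shows how prices could be pulled with the Yahoo Finance (yfinance) module and inserted into Snowflake. The tickers, table name and connection details are placeholders rather than the project's actual configuration.

```python
import yfinance as yf
import snowflake.connector

# Illustrative tickers and connection details - placeholders only.
TICKERS = ["VOO", "QQQ"]

conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    warehouse="COMPUTE_WH", database="FINANCE", schema="RAW",
)

def load_prices(ticker: str) -> None:
    """Download recent daily prices for a ticker and insert them into Snowflake."""
    # Pull one month of daily history and keep only the columns we need.
    df = yf.Ticker(ticker).history(period="1mo").reset_index()
    df = df[["Date", "Open", "High", "Low", "Close", "Volume"]]

    rows = [
        (ticker, row.Date.strftime("%Y-%m-%d"), float(row.Open), float(row.High),
         float(row.Low), float(row.Close), int(row.Volume))
        for row in df.itertuples(index=False)
    ]

    # executemany builds a multi-row INSERT (ETF_PRICES is a hypothetical table name).
    conn.cursor().executemany(
        "INSERT INTO ETF_PRICES (TICKER, PRICE_DATE, OPEN, HIGH, LOW, CLOSE, VOLUME) "
        "VALUES (%s, %s, %s, %s, %s, %s, %s)",
        rows,
    )

for t in TICKERS:
    load_prices(t)
conn.close()
```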
Code was deployed to Snowflake using GitHub Actions, enabling automated deployment of Snowflake objects, users and roles and making deployments to Snowflake easier.
- Features:
- Displays key metrics like YTD return, 1-month return, and 1-year return
- Interactive filtering by ETF ticker and sector
- Visualizations including line charts for price trends and simplified candlesticks
- Real-time data pulled from Snowflake and displayed using Streamlit
- Responsive dashboard layout with custom formatting and grid-based design
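For a flavour of the Streamlit layer, here is a heavily simplified sketch; the table name, tickers and connection details are placeholders, and the real dashboard includes the additional filters and charts listed above.

```python
import pandas as pd
import snowflake.connector
import streamlit as st

st.set_page_config(page_title="ETF Dashboard", layout="wide")

# Placeholder connection details - real credentials would come from Streamlit secrets.
conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    warehouse="COMPUTE_WH", database="FINANCE", schema="ANALYTICS",
)

ticker = st.sidebar.selectbox("ETF ticker", ["VOO", "QQQ"])  # example tickers

# Pull the price history for the chosen ticker from Snowflake (hypothetical table).
df = pd.read_sql(
    "SELECT PRICE_DATE, CLOSE FROM ETF_PRICES WHERE TICKER = %s ORDER BY PRICE_DATE",
    conn,
    params=[ticker],
)

latest = df["CLOSE"].iloc[-1]
period_return = (latest / df["CLOSE"].iloc[0] - 1) * 100

# Key metrics in a grid layout, with a price-trend line chart underneath.
col1, col2 = st.columns(2)
col1.metric("Latest close", f"${latest:,.2f}")
col2.metric("Return over period", f"{period_return:.1f}%")

st.line_chart(df.set_index("PRICE_DATE")["CLOSE"])
```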
The image below displays the process used to move data from source to end user:
To view images of the final dashboard, click here
Future Projects - ideas I plan to pursue
- More Cloud Engineering projects...
- Cyber Security
- Ethical Hacking?
Certifications and Achievements
- AWS Certified Data Engineer Associate
- Snowflake SnowPro Core Certification
- Databricks Certified Data Engineer Professional
- Financial Edge Micro Degree: Certificates in Accounting, Modelling, Valuation, FX Markets as well as Fixed Income, Commodities and Portfolio Management etc.
- Bloomberg Market Concepts: specialising in Currencies, Equity, Fixed Income and Economic Indicators.
- Bilingual: Fluent in English and Danish, competent in German.
- Hold a UK driving licence.
Future Certifications - I am in the process of studying for these
- AWS Solution Architect Associate
- Microsoft Fabric Data Engineer Associate
Education
Institution (inc Course) |
Qualification/ Grades |
Subjects |
Projects |
University of Nottingham - MEng Aerospace Engineering |
2:1 |
- Propulsion
- Advanced Propulsion
- Aerodynamics
- Advanced Aerodynamics
- Flight Mechanics
- Statics and Dynamics
|
- Dissertation - Used Computational Fluid Dynamics (CFD) in ANSYS Fluent to successfully recreate a real-life NASA experiment on fluid flow through nozzles. Noise reduction methods were then modelled on top of this experiment in a 2D model to measure turbulence and determine whether noise had been reduced.
- Unmanned Aerial Vehicle (UAV) - Personal work included designing the UAV; using JavaFoil and XFLR5 I designed the aerofoil and fuselage of the drone.
- Spacecraft design for asteroid mining - Responsibilities included designing the internals of the asteroid mining system and the docking mechanism for the spacecraft in 3DExperience, so that the payload could safely be transferred to an Earth-orbiting satellite.
- Designed wings for a glider, taking into consideration weight, material, cost, etc., and then presented our design choices to peers and industry experts.
|
Chesham Grammar School |
A-Levels:
- Maths – A
- ICT – A
- Physics – B
- Psychology (AS) – B
GCSE:
- 1A*, 7A, 4B
|
See Grades Column |
N/A |
Hobbies and Interests
- I am a semi-professional footballer, having played for the Reading FC Academy and Wycombe Wanderers FC Academy in my youth
- 9-handicap golfer, having represented my local club (Harewood Downs Golf Club) when I was younger
- Trialled for county cricket as an opening batsman and wicketkeeper, having played for the county champions from the ages of 7 to 15
- Fitness: I enjoy going on runs, cycling, swimming and going to the gym.
- Keen and experienced traveller who enjoys experiencing different cultures and meeting new people
- Music: I have DJ’d in some UK clubs, playing dance music as well as classics from the 70s and 80s