About Me

I’m Joe, a Lead Software Engineer with over 10 years of experience.

Who Am I?

My Story

I was born in Dorchester (UK), along with my twin brother.

My career has grown rapidly, driven by a strong desire to work hard, advance my personal situation and improve the technology available to others.

Right from an early age I have been into technology and making money… When I was just 13, I started my own web design business. At first I built simple HTML and CSS sites for friends and family, but I quickly grew it into a company that generated a monthly income, with clients all over the country, some with international ventures. This growth taught me how to really listen to every client’s requirements and needs.

As well as the web design, my younger years, alongside school, were also filled with other scripting projects. These got me excited and invested in the realization that if I can think it, with enough time and drive, I can make it. It truly is magic!

My career has already had many high moments; below are a few of my favourites:

April 2024: Meeting Minutes Generator

I took on the ambitious task of creating an end-to-end Meeting Minutes Generator, a sophisticated system capable of extracting valuable information from meeting transcripts without prior context about the meeting’s topic or attendees’ roles. This project involved a combination of natural language processing (NLP) techniques, data parsing, and automated document generation.

The Meeting Minutes Generator was designed to produce comprehensive meeting summaries, including attendance logs, action items, meeting minutes, and other relevant items crucial for post-meeting documentation and follow-ups.

The system employed advanced text analysis techniques to identify context cues, sentiment analysis to gauge the tone of discussions, and topic modelling to categorize discussions into relevant themes. This holistic approach ensured that the generated meeting minutes were not only accurate but also provided valuable insights into the meeting dynamics and outcomes.

Additionally, I integrated the Meeting Minutes Generator with existing collaboration tools and workflows, allowing for seamless integration into the organization’s productivity ecosystem. This included APIs for data input/output, automated notifications for action items, and customizable templates for generated documents to meet specific formatting requirements.

The success of the Meeting Minutes Generator was evident in its ability to significantly reduce manual effort in preparing meeting summaries while ensuring consistency and accuracy across diverse meetings and participants. It empowered teams to focus more on productive tasks rather than administrative overhead, driving efficiency and collaboration within the organization.

This project highlighted my ability to conceptualize and implement innovative solutions that leverage AI and automation to streamline complex processes and deliver tangible value in real-world scenarios, particularly in the realm of knowledge management and workflow optimization.

August 2023: LLM Connector API

As part of my role, I took on the responsibility of designing and building the LLM (Large Language Model) Connector API, which served as the central component for all GenAI (Generative Artificial Intelligence) interactions across all clients, for multiple models. This project required a deep understanding of both GenAI technologies and scalable API design principles.

The LLM Connector API was meticulously crafted to handle a wide range of GenAI requests, including natural language processing tasks, data generation, embedding, and model inference. I collaborated closely with data scientists and AI specialists to ensure that the API’s endpoints and functionalities aligned seamlessly with our GenAI models and workflows.

One of the key challenges was ensuring high performance and low latency, especially during peak usage periods. I optimized the API’s architecture and implemented caching and queuing mechanisms to reduce response times and enhance overall system responsiveness. Additionally, I integrated robust authentication and authorization mechanisms to safeguard sensitive data and control access to GenAI capabilities based on client permissions and roles, along with regional requirements.
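The caching idea above can be sketched in a few lines of Python. This is a purely illustrative, in-memory version under my own assumptions; the class and key scheme are invented for the sketch and are not the Connector API’s actual implementation:

```python
import hashlib
import json

class ResponseCache:
    """Illustrative in-memory cache for LLM responses (a sketch of the
    caching idea, not the real Connector API). Keys are a hash of the
    model name plus the request payload, so identical requests can be
    answered without re-invoking the model."""

    def __init__(self):
        self._store = {}

    def _key(self, model: str, payload: dict) -> str:
        # Canonical JSON (sorted keys) so logically equal payloads hash alike.
        raw = json.dumps({"model": model, "payload": payload}, sort_keys=True)
        return hashlib.sha256(raw.encode()).hexdigest()

    def get(self, model: str, payload: dict):
        # Returns the cached response, or None on a cache miss.
        return self._store.get(self._key(model, payload))

    def put(self, model: str, payload: dict, response: str) -> None:
        self._store[self._key(model, payload)] = response

cache = ResponseCache()
cache.put("model-x", {"prompt": "Hello"}, "Hi there!")
print(cache.get("model-x", {"prompt": "Hello"}))  # → Hi there!
```

In a production system the dictionary would be replaced by a shared store with eviction and TTLs, but the keying principle is the same.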

The success of the LLM Connector API was evident in its widespread adoption by clients across various industries. It became the backbone of our GenAI platform, facilitating sophisticated AI-driven interactions while maintaining scalability, reliability, and security standards. Continuous monitoring and performance adjustments were integral to ensuring the API’s ongoing efficiency and effectiveness, leading to positive feedback from both internal stakeholders and external clients.

Overall, the design and implementation of the LLM Connector API underscored my ability to architect complex systems, integrate advanced technologies seamlessly, and deliver impactful solutions that drive innovation and value for businesses leveraging GenAI capabilities.

October 2020: First-Class Honours Bachelor’s Degree

The four years of working full time while doing the degree in the evenings and at weekends were not easy, not by any stretch of the imagination, but they sure were fun! By making the most of every opportunity given to me I was very busy, not leaving much free time for my friends, family or myself, but it was worth it!

I ended up achieving a First-Class Honours Bachelor’s Degree in Digital and Technology Solutions with BPP University in London.

I achieved 100% in two of my modules, and 80% or above in eight others.

My final dissertation project was on a Python logger I developed. The report provided an analysis and evaluation of the current and prospective enhanced querying and reporting of telemetry across the firm’s teams, followed by the development and deployment of a standardized Python logging package. Methods of analysis included researching the existing solutions already on the market, exploring the option of appointing an external company to write a new system for the firm, and investigating whether creating the new package internally was feasible, as well as examining the relationship between different employees and their experiences of and dependencies on logging within their roles.

Other calculations included rates of return based on improvements to the reporting and querying capabilities for Python application telemetry within the first six months, amongst others. The data analysed showed that the firm was not using the metadata produced in its existing logs to full advantage. In particular, comparative querying performance was poor, wasting up to three hours a week per employee looking through disorganised and sub-optimal log entries.

The development of the package also included designing and implementing a full Continuous Integration and Continuous Delivery/Deployment (CI/CD) pipeline, which permits artefacts to be deployed into a production environment, all while being automatically tested, scanned and compliant with all firm and regulatory requirements.

This report found that the firm’s prospects in its old position were not positive. The major areas of weakness required further investigation and remedial action by application developers and their surrounding teams. Recommendations discussed include:

  • Producing a standardized Python logging package, bringing the opportunity for enhanced querying and reporting of telemetry, achieved by using a JSON-format output.
  • Knowing the end user helps to reduce unnecessary logging, allowing for a higher value-add at a reduced cost in resources and time.
  • Features required by different lines of business in the future will need to be incorporated into the package accordingly.
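The JSON-output recommendation can be sketched with the standard library alone. This is an illustrative minimal version of the idea, not the firm’s actual package; the field names are my own choice for the sketch:

```python
import json
import logging

class JsonFormatter(logging.Formatter):
    """Emit each log record as a single JSON object, so telemetry can be
    queried on structured fields rather than scraped from free text.
    (Illustrative sketch of the report's recommendation.)"""

    def format(self, record: logging.LogRecord) -> str:
        return json.dumps({
            "time": self.formatTime(record),
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
        })

# Wire the formatter into a normal stdlib logger.
handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger = logging.getLogger("demo")
logger.addHandler(handler)
logger.setLevel(logging.INFO)
logger.info("order processed")  # emitted as one queryable JSON line
```

Because every line is valid JSON with fixed keys, log aggregators can index and filter on level, logger name and timestamp directly, which is exactly where the report found querying time was being lost.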

I attained a staggering 96% for this report, which I was over the moon about!

November 2021: Blue Prism Remote Digital Worker Communications

Modernizing Legacy Product Architecture and Procedures

A key milestone in my career was the successful modernization of a legacy product architecture and procedures, where I single-handedly defined and executed a strategic plan. I conducted a thorough analysis of the existing architecture, identified pain points, and developed a roadmap for modernization.

This initiative involved migrating monolithic components to microservices, adopting containerization with Docker and AWS ECS for orchestration, and implementing CI/CD pipelines for streamlined deployment and testing. These efforts significantly improved scalability, reliability, and performance, setting a new standard for our product infrastructure.

 

Designing Communication Channels for Remote Digital Workers

A critical challenge I tackled was enabling seamless communication between remote digital workers and on-premises systems. Leveraging my expertise in RESTful APIs and event-driven architecture, I designed and implemented robust communication channels using technologies such as RabbitMQ.

I developed a hybrid architecture that allowed remote workers to securely transmit and receive data, leveraging REST APIs for synchronous interactions and RabbitMQ for asynchronous, event-based communication. This solution not only improved data transfer efficiency but also enhanced system resilience and fault tolerance.
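The split between the two paths can be shown with a toy in-process model. Everything here is illustrative: the class, names and in-memory queue stand in for HTTP and RabbitMQ, which is what the real system used:

```python
from queue import Queue

class HybridChannel:
    """Toy model of the hybrid pattern: a synchronous request/response
    path (as a REST call would be) versus fire-and-forget events
    buffered on a queue (as RabbitMQ would provide). Purely
    illustrative, not the production architecture."""

    def __init__(self, handler):
        self._handler = handler   # stands in for the on-premises endpoint
        self._events = Queue()    # stands in for a RabbitMQ queue

    def request(self, payload):
        """Synchronous path: the caller blocks until a response arrives."""
        return self._handler(payload)

    def publish(self, event):
        """Asynchronous path: the caller continues immediately."""
        self._events.put(event)

    def drain(self):
        """Consumer side: process whatever events have been buffered."""
        processed = []
        while not self._events.empty():
            processed.append(self._handler(self._events.get()))
        return processed

channel = HybridChannel(handler=lambda msg: f"handled:{msg}")
print(channel.request("status"))  # → handled:status (synchronous)
channel.publish("job-1")
channel.publish("job-2")
print(channel.drain())            # both buffered events processed later
```

The fault-tolerance benefit comes from the queue: if the consumer is briefly unavailable, published events simply wait rather than fail, while the synchronous path stays reserved for calls that genuinely need an immediate answer.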

 

Ensuring Scalability and Flexibility

My focus on scalability and flexibility was evident in the architecture I designed. I implemented load-balancing strategies, auto-scaling mechanisms, and efficient resource utilization to ensure optimal performance under varying workloads.

By decoupling components through messaging queues and defining clear communication protocols, I achieved greater modularity and extensibility. This allowed for seamless integration of new features and services without disrupting existing functionalities, supporting the organization’s growth and innovation objectives.

 

Continuous Improvement and Monitoring

Post-implementation, I established comprehensive monitoring and analytics frameworks using tools like Prometheus and Grafana. This enabled proactive monitoring of system health, performance metrics, and event-driven workflows, empowering me to identify bottlenecks, optimize resource utilization, and address issues in real time.

Regular performance tuning, code reviews, and self-driven learning sessions were integral to my continuous improvement efforts. By fostering a mindset of continuous learning and innovation, I ensured that the modernized architecture remained agile, resilient, and aligned with industry best practices.

August 2021: Dorset to Wales and back again in 24 hours

During lockdown many people found new interests and hobbies to keep themselves occupied. For me it was cycling, an activity that not only allowed me to get out of the house, but also to clear my head. A few months into lockdown I had joined a local cycling club (Elmer Cycle Club, or ECC for short). We had decided to bike to Wales and back in just 24 hours to raise money for Weldmar Hospicecare.

Both my Dad and Grandad were cared for by Weldmar, and so supporting the charity is very important to me. Weldmar were amazing, my Grandad was so relaxed while he was there. The nurses were so nice and supportive. With my Dad, it was so good to know that he was being well cared for. And in his last ten days, we were able to be constantly with him by his side.

Other families and friends have been cared for by Weldmar, and I know they’ve always had amazing experiences like we did. Weldmar just takes the pressure away.

The challenge was great fun, and even after putting in months of training the ride was tough. It took just over 18 hours in the saddle in total, and not sleeping for 24 hours, with a day’s work beforehand, was hard.

The ride started on Friday 20th August at 8pm, and I cycled through the night, arriving in Wales early the next morning. After a quick stop to refuel, we began the journey back to Poole! I raised about £5,000 for the charity during the challenge.

June - July 2019: My Authentication Package

The firm was rolling out a new SSO (Single Sign-On), password-less authentication system, which meant application developers needed to migrate their current authentication methods over to it. This was no small task: correctly verifying, authenticating and validating users with the new system was vastly different from the previous access-control tool. With countless applications needing to make this change (nearly 1,500 lines of code per application), I saw it would cause issues.

So I created a Python package as an intermediary between the application and the authentication tool. This standardised the authentication process for application teams, reducing the work to just five lines of code, increasing security with above-industry-standard authentication and decreasing technical debt. My package quickly became the firm’s strategic standard for Python authentication.
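The "five lines of code" experience can be illustrated with a simple decorator pattern. Every name here, and the token check itself, is invented for the sketch; it shows the shape of an intermediary package, not the firm’s real API:

```python
import functools

# Hypothetical stand-in for the SSO backend's token store.
VALID_TOKENS = {"token-123": "alice"}

def authenticated(func):
    """Hide verification/validation behind one decorator, so an
    application needs a few lines instead of ~1,500. Illustrative only."""
    @functools.wraps(func)
    def wrapper(token, *args, **kwargs):
        user = VALID_TOKENS.get(token)
        if user is None:
            raise PermissionError("invalid or expired token")
        return func(user, *args, **kwargs)
    return wrapper

# Application-side usage: this is the whole integration surface.
@authenticated
def get_report(user):
    return f"report for {user}"

print(get_report("token-123"))  # → report for alice
```

The design point is that the verification logic lives in one audited place: applications never touch tokens directly, so a security fix in the package reaches every consumer at once.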

Following the success of the SDLC pipeline (below), I had gained many valuable connections while over in the States, and again I was asked to fly out and give demonstrations on how teams could implement my package in their applications. To date (June 2021) the package has been downloaded over 500,000 (500K) times, and is currently being called over 2,500,000 (2.5M) times a week.

October 2019: IBM TechU Prague

A few others in my team and I were invited and flown out to attend IBM’s TechU, hosted in Prague, where world-class techies gave us detailed sessions on topics such as Artificial Intelligence, Machine Learning, Big Data, Cloud Architecture and many more.

All of the speakers were best-in-class world experts, with the likes of IBM’s Wolfgang Bosch, who at the time was the Business Development Executive for Watson and AI Innovation. Many of the sessions were very personal, with small audiences of under 20 people, so we could really dive deep into the topics with specific questions and answers, and many 1:1 follow-up sessions later into the evenings. A truly monumental experience.

2018 - 2019: OS Build Automation

There was a business need to improve the time to market on provisioning new operating systems for an accelerated computing offering. At the time, the system in place could take weeks, and in some cases months, to complete, with a success rate of almost zero since every order required manual intervention.

I wrote a suite of APIs which communicated with our estate management nodes, the firm’s compute ordering tool and the firm’s build pipeline orchestration tool. This solution has been operational ever since, with almost daily production releases and new features continuously delivered to our clients, improving our customers’ user experience.

Provisioning times went down from weeks or months to just hours, fully automated end-to-end. The need for manual intervention was removed: a negligible number of builds require attention, and those that do are automatically flagged to the team without the client even knowing there was an issue.

A quote from my manager at the time:
Not everyone fully understands the work delivered beyond the headline “OS build automation”. So much additional work performed by Joe is not visible outside of the team. The properly formed APIs, the right error codes, the unit test coverage, the code quality, the JIRA usage and tracking, the self-management and discipline to deliver quality, the Jenkins pipeline and what it means for automated code deployment and code quality, the production release process, security authentication onboarding etc. All these additional tasks were taken on and owned, to deliver a solution that worked from day 1. By using the knowledge he previously had and what he has gained, he educates others, and has become a Subject Matter Expert that many others look to.

February 2019: Firebase Python Package

I wrote the Python package ‘firebase’, a Python interface to Google’s NoSQL database (Firebase) REST API.

Currently (as of June 2021) the package is downloaded over 35,000 (35K) times a month, with a total of over 350,000 (350K) downloads; up-to-date stats can be found here.

Effectively, the package converts a user’s NoSQL database into a Python-readable object with simple data manipulation. It can also handle complex queries such as ordering (even though it’s an inherently unstructured database schema).

October 2018: JPMC's First Python SDLC Pipeline

Earlier in the year I (along with thousands of other developers) had noticed that a fully automated Jenkins pipeline for Java had been implemented internally, but there was no complete end-to-end Python solution. I was writing a suite of Python REST APIs at the time, releasing new features into production on a somewhat regular basis (as much as I could). The major blocker for my innovative juices was the lengthy wait for other teams to test, scan and approve my code. So I took the opportunity to write the firm’s first fully end-to-end Python SDLC pipeline; it was written in Groovy and ran on Jenkins, just like the Java one.

Effectively, it automated the build, testing, scanning and deployment of an application, and its subsequent new features, into a production environment, automatically meeting the firm’s and regulatory requirements via the automated scans and associated “toll-gates”. All of this happened automatically from the point a developer pushed their code into an SCM (Source Code Management system, for example Git), which they would have to do anyway.
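The stage-and-toll-gate flow can be modelled in a few lines of Python. The stage names and checks below are invented for illustration; the real pipeline was written in Groovy on Jenkins with the firm’s own scanners:

```python
def run_pipeline(commit, stages):
    """Run each (name, check) stage in order; a failing toll-gate
    blocks deployment and reports where the pipeline stopped.
    (Toy model of the flow, not the real Jenkins pipeline.)"""
    passed = []
    for name, check in stages:
        if not check(commit):
            return {"deployed": False, "failed_at": name, "passed": passed}
        passed.append(name)
    return {"deployed": True, "failed_at": None, "passed": passed}

# Hypothetical gates: compile, test, then a security scan.
stages = [
    ("build", lambda c: c["compiles"]),
    ("unit-tests", lambda c: c["tests_pass"]),
    ("security-scan", lambda c: not c["has_vulns"]),
]

commit = {"compiles": True, "tests_pass": True, "has_vulns": False}
print(run_pipeline(commit, stages))  # all gates pass, so it deploys
```

Because every push runs the same ordered gates, a commit can only reach production by satisfying each requirement, which is what removed the lengthy manual approval waits.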

For a firm with over 40,000 technologists, this huge time-saving was a big deal. I was flown over to New York and New Jersey and gave presentations to hundreds of the firm’s most senior developers, showing them my work and implementation strategy so they too could utilise my pipeline for their applications, saving them and the firm time, increasing value-add throughput, lowering code smell, and improving their applications’ security and time to market.

July 2018: Round The Island

I was sponsored to join the crew of one of the Clipper boats, which sailed around the Isle of Wight while raising money for the Ellen MacArthur Cancer Trust. The event was amazing, and there are some photos here for those who are interested.

September 2017: Hackathon

I was part of a team which developed a system and method for implementing a client sentiment analysis tool. We did the initial development within just a 12-hour stretch at a Hackathon. This later led to me becoming a recognised inventor with the US Patent & Trademark Office in July 2018 (20190220777).

July 2016: Fully Paid-for Degree Offer

In 2016 I was offered a job at J.P. Morgan Chase & Co in Bournemouth, as part of their initial intake for their Technology Degree Apprenticeship. Of course I was over the moon and accepted.

This really catapulted my exposure to other senior software developers and engineers; while improving my own skills, I was still able to pass on my experiences to them.

June 2013: ‘Think Big’ European Youth Development Programme

I was one of 20 business students selected to attend this event, sponsored by O2 to build confidence, entrepreneurial spirit and digital skills in young people.

The session comprised interactive activities to help us gain entrepreneurial skills, and a hands-on chance to learn about digital technology and the ideation process. Group ideation and developing skills in problem solving, teamwork and creative thinking were the core elements of the Think Big event.

We worked in mixed groups and were encouraged to produce fresh ideas and apply a digital solution to a particular problem in our lives. I had the opportunity to interact with and pose questions to Telefonica employees, who were also part of the activities.

December 2012: My First Website

This is where it all started… Tilsed.com

My Approach

I bring deep knowledge of design, analytics, development, coding, testing and application programming.

I specialize in GenAI LLM prompt engineering and Python REST APIs, with an exceptional understanding of CI/CD (Continuous Integration and Continuous Delivery) tools such as Jenkins, and how to utilize them correctly.

I have a great depth of knowledge of how to implement front-end, back-end and middleware technologies in enterprise environments.

Experienced in various legacy and modern technology domains and practices, giving me insight into how to improve old systems and how to create new solutions built to last decades.

I can solve extremely complex, mission-critical problems in high-pressure situations in a timely manner, while still clearly sharing my strategic direction and understanding with others.

Throughout my career, I have led teams in developing robust and scalable software solutions that have significantly and positively impacted business operations and user experiences.

I Optimize Businesses of Every Kind, Every Size.

Are You Next?

 

Get In Touch

Ready to Chat?

Contact Me Directly Below