Jobs
Ready2Use S.r.l.
From: 2024-06
To: Current
Technologies used
Java
Scala 3
Typescript
Python
Elixir
Azure Pipelines
GitHub Actions
Jenkins
MySQL
Spring Boot
Linux
Shell
Docker
ANTLR
Kubernetes


Summary

My role at Ready2Use as a Java Developer was full of interesting projects and involved completely different stacks from my previous job.

New stacks, new problems, new skills!

It’s better to see many different things at the start of a career to grow transversal skills.

Now, onto the things I’ve worked on!

Workflow: Java 21 monolithic-ish application (with Camunda)

I’ve started working on a semi-monolithic application in Java 21 to manage an approval workflow to be used internally by the customer.

There was heavy usage of Camunda 7, with its behaviour modified for more robust usage, such as validating the process initialization payload against the starting form, since by default Camunda performs no validation.

We also encountered some limitations related to computing values dynamically inside forms, which I was tasked to find a solution for.

A custom language (or DSL) was needed to define operations on form fields and/or constants.

My first attempt was to create a recursive descent parser in Scala 3.

I chose Scala because I could write the logic once and use it in both the backend and the frontend, since I could target both Java (Scala targets the JVM by default) and JavaScript thanks to Scala.js.

However, for reasons related to the frontend implementation (in Angular), we later had to discard this option after developing a PoC.

The second solution I proposed was defining a grammar via ANTLR and then implementing the parser logic in both Java and TypeScript.

This was a sub-optimal solution since I had to write the logic twice, first in Java and then in TypeScript, while also being careful to check that the behaviour was consistent across both implementations.

Since the expected results were well defined, I used a TDD approach for both parsers.
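The production parsers aren’t shown here (they lived in Scala 3 first, then Java and TypeScript via ANTLR), but as a rough illustration of the recursive descent idea, here is a minimal Python sketch of an expression DSL over form fields and constants. The `$field` syntax and the grammar are made up for this example:

```python
import re

# Minimal sketch, not the production DSL. Grammar:
#   expr   := term (('+'|'-') term)*
#   term   := factor (('*'|'/') factor)*
#   factor := NUMBER | '$'FIELD | '(' expr ')'
TOKEN = re.compile(r"\s*(?:(\d+\.?\d*)|\$(\w+)|(.))")

def tokenize(src):
    for num, field, op in TOKEN.findall(src):
        if num:
            yield ("num", float(num))
        elif field:
            yield ("field", field)
        else:
            yield ("op", op)
    yield ("eof", None)

class Parser:
    def __init__(self, src, fields):
        self.toks = list(tokenize(src))
        self.pos = 0
        self.fields = fields  # form field values, e.g. {"qty": 3}

    def peek(self):
        return self.toks[self.pos]

    def next(self):
        tok = self.toks[self.pos]
        self.pos += 1
        return tok

    def parse(self):
        value = self.expr()
        if self.peek()[0] != "eof":
            raise SyntaxError("trailing input")
        return value

    def expr(self):
        value = self.term()
        while self.peek() in (("op", "+"), ("op", "-")):
            _, op = self.next()
            rhs = self.term()
            value = value + rhs if op == "+" else value - rhs
        return value

    def term(self):
        value = self.factor()
        while self.peek() in (("op", "*"), ("op", "/")):
            _, op = self.next()
            rhs = self.factor()
            value = value * rhs if op == "*" else value / rhs
        return value

    def factor(self):
        kind, val = self.next()
        if kind == "num":
            return val
        if kind == "field":
            return self.fields[val]  # substitute the form field's value
        if (kind, val) == ("op", "("):
            value = self.expr()
            if self.next() != ("op", ")"):
                raise SyntaxError("expected ')'")
            return value
        raise SyntaxError(f"unexpected token {val!r}")

print(Parser("$qty * (10 + $extra)", {"qty": 3, "extra": 2}).parse())  # 36.0
```

Having the evaluation inline like this is exactly why writing it twice (Java and TypeScript) risked behavioural drift, and why test-first made sense: the same test cases could be ported to both implementations.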

VDS configuration for development purposes across multiple projects & teams

I was also tasked with setting up a VDS to be used for development purposes.

I chose NGINX as a reverse proxy to allow the deployment of multiple projects (some deployed on a subpath, others with their own subdomain). I also suggested deploying only Dockerized apps and avoiding bare-metal installations where possible, both to have reproducible installations and to keep the VDS clean.

I was given an FQDN, for which I used Certbot and a crontab entry to keep the certificate updated.
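As a rough sketch of the routing setup (all server names, ports, and paths below are placeholders, not the real ones), the NGINX side looked something like this:

```nginx
# Hypothetical sketch: one project under a subpath, one under its own subdomain.
# All upstreams are Dockerized apps published on localhost ports.
server {
    listen 443 ssl;
    server_name dev.example.com;                # placeholder FQDN
    ssl_certificate     /etc/letsencrypt/live/dev.example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/dev.example.com/privkey.pem;

    location /project-a/ {                      # subpath deployment
        proxy_pass http://127.0.0.1:8081/;
        proxy_set_header Host $host;
    }
}

server {
    listen 443 ssl;
    server_name project-b.dev.example.com;      # subdomain deployment
    ssl_certificate     /etc/letsencrypt/live/dev.example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/dev.example.com/privkey.pem;

    location / {
        proxy_pass http://127.0.0.1:8082;
    }
}
```

The certificate renewal was simply a scheduled `certbot renew` run from the crontab.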

VDS configuration for a self-hosted instance of KMS

There was also another VDS where I installed a KMS (Vaultwarden) for internal use, configured with fail2ban and other security measures, plus scheduled backups (same as the development server) to prevent unpleasant headaches in the future.

Automatic code scan for any project of any stack (with Mend)

This one was very interesting: a (big) customer asked for a way to check for vulnerabilities in their various projects, while also having a custom workflow which required multiple actors to allow or deny the deployment of a new version of any of their projects.

Many stacks were involved in their projects: Java, Python, PHP, JS (frontends), C# and so on.

They used Mend as a tool but also wanted a 3rd party to review the results and give an opinion of their own.

The problem was that we needed an environment to also build the projects, which is difficult enough when managing so many stacks, but even worse if you consider that different projects also had different dependency versions (Java 11/17/21, Python 2/3, etc…).

My proposal was to use the previously developed Workflow application (which we had carefully developed to be as agnostic as possible) to manage a process which could start scans while also receiving the 3rd-party opinion on each vulnerability found.

To manage the various requirements of the different projects, I asked (where one was not already present) for a Dockerfile whose only job was to correctly build the project (where that made sense) and to add some metadata via labels about the whereabouts of the project files.

From this Dockerfile I would build an image and extend it with another image I designed to scan the project with Mend and output the report file.

It imposed as few hard requirements as possible on the base image: a particular set of labels, plus glibc and curl.
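To give an idea of the contract (the label names, base images, and paths below are invented for the example, not the real ones), a per-project Dockerfile could look like this:

```dockerfile
# Hypothetical example of the per-project contract: build the project and
# expose where its files live via labels (label names are made up here).
FROM maven:3.9-eclipse-temurin-17 AS build
WORKDIR /src
COPY . .
RUN mvn -q package -DskipTests

FROM eclipse-temurin:17-jre
# Metadata consumed by the scanning image:
LABEL scan.project.dir="/app" \
      scan.project.stack="java"
COPY --from=build /src/target/ /app/
# Base-image requirements: a glibc distro with curl available
RUN apt-get update && apt-get install -y --no-install-recommends curl \
    && rm -rf /var/lib/apt/lists/*
```

The scanning image could then read the labels to locate the project files, regardless of the underlying stack.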

When the entire process was started from the workflow, a self-hosted Jenkins pipeline would then start and manage all the builds needed to get the scan result.

It was also interesting to see how Jenkins allows you to create powerful pipelines thanks to Groovy.
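As a sketch of the pipeline shape only (the stage names, parameter, and scripts below are made up, not the real ones), a declarative Jenkinsfile for this kind of flow might look like:

```groovy
// Hypothetical sketch of the pipeline shape, not the real pipeline.
pipeline {
    agent any
    parameters {
        string(name: 'PROJECT_REPO', description: 'Repository to scan')
    }
    stages {
        stage('Build project image') {
            steps {
                sh 'docker build -t project-under-scan "$PROJECT_REPO"'
            }
        }
        stage('Run Mend scan') {
            steps {
                // Extend the project image with the scanner image, collect the report
                sh './scan.sh project-under-scan reports/'
            }
        }
        stage('Publish report') {
            steps {
                archiveArtifacts artifacts: 'reports/*'
            }
        }
    }
}
```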

Health insurance microservice-ish application in Java 21/11

After some time, I moved to a consultancy engagement on a project (hosted on GitHub) which managed health insurance policies, with the backend written in Java 11 (the majority of modules) and 21 (a small subset of modules).

This project was a very good training ground for tackling a big codebase with both development AND organizational problems.

Trying to understand what some intricate code did, or refactoring existing code to solve performance problems, while also dealing with parts of the code you didn’t even know existed breaking completely because you changed an (apparently) unrelated line of code.

I’ve also finally found out how useful git bisect can be to pinpoint regression problems.
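To show the idea in a self-contained way (the throwaway repository, the values, and the “test” below are a toy setup, not the real project), `git bisect run` can pinpoint the first bad commit automatically:

```shell
#!/bin/sh
# Toy demo of `git bisect run`: build a tiny repo where commits with a
# value >= 4 are "broken", then let bisect find the first bad one.
set -e
repo=$(mktemp -d) && cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name demo
for i in 1 2 3 4 5; do
  echo "$i" > value.txt
  git add value.txt
  git commit -qm "commit $i"
done
git bisect start HEAD HEAD~4      # HEAD is bad, HEAD~4 is known good
result=$(git bisect run sh -c 'test "$(cat value.txt)" -lt 4')
echo "$result" | grep "is the first bad commit"
git bisect reset >/dev/null
```

On the real project the script passed to `git bisect run` was the actual build/test command, so a regression hiding among dozens of commits could be found in a handful of automated checkouts.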

Smaller developments and honorable mentions

🐍 Stress test project with Python and Locust to check what load an internally developed chatbot could manage.

💧 CSV parser in Elixir that could filter specific types of rows (a personal exploration of Elixir; it was not a requirement).

DigitalSoft S.r.l.
From: 2022-02
To: 2024-06
Technologies used
Python3
ECMAScript
Rust
C#
Azure Pipelines
SQL Server
SQLite
Flask
FastAPI
Google OR-Tools
Linux
Shell
Docker
OPC-UA


Summary

My role at DigitalSoft began as an OR Python Developer and was heavily influenced by my studies and thesis.

Since this was my first job, I tried to learn as much as I could.

Let’s begin!

Production plan optimizer

The first problem I was tasked to solve was creating a production planning optimization algorithm generic enough to be useful to all customers, but also with enough specialization that its results were non-trivial.

Definitely a colorful start!

I chose to approach this problem by modeling it as a Mixed Integer Linear Program (MILP) and creating a library of the most common constraints requested by customers, which could also be partly customized. A very helpful library in my development process was Google OR-Tools, check it out.
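The actual customer models are in the thesis; just to give a flavor of the kind of formulation involved, here is a toy lot-sizing-style MILP with invented symbols (not the real model):

```latex
\begin{aligned}
\min \quad & \textstyle\sum_{p,t} \left( c_p \, x_{p,t} + h_p \, s_{p,t} \right)
  && \text{production + holding costs} \\
\text{s.t.} \quad
  & s_{p,t-1} + x_{p,t} - s_{p,t} = d_{p,t} && \forall p,t \quad \text{inventory balance} \\
  & \textstyle\sum_{p} a_p \, x_{p,t} \le C_t && \forall t \quad \text{line capacity} \\
  & x_{p,t} \le M \, y_{p,t} && \forall p,t \quad \text{setup indicator} \\
  & x_{p,t},\, s_{p,t} \ge 0, \quad y_{p,t} \in \{0,1\}
\end{aligned}
```

Here \(x_{p,t}\) is the quantity of product \(p\) produced in period \(t\), \(s\) the stock, \(d\) the demand, \(C_t\) the line capacity, and \(y\) a binary setup variable; each reusable "constraint" in the library corresponded to a family of inequalities like these.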

It was a backend Python application which exposed APIs for each optimizer and received the data to fill the models with. I first started with Flask but, after searching a bit more, I settled on FastAPI, which helped with some features I needed (some of which were strongly related to API documentation with OpenAPI).

More on this can be read in my thesis, because the development took months and it would be too long to write here. (Italian only for now, sorry!)

Wait, did I say optimizers? Yes, because…

Production execution optimizer

Together with this, I also worked on a production execution Gantt optimizer using Constraint Programming (CP) with Block Zone constraints.

I do not have a thesis on this one, but much can be found by searching for the Flexible Job Shop Problem.

But to get an idea of an “easier” subproblem, check out the Job Shop Problem example.

The Block Zone constraint was a requirement where a user could set up time windows in which no production of any kind must happen (think of a production line maintenance window).
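The real scheduler enforced this as a CP constraint; as a small pure-Python illustration of what the constraint means (the numbers are invented), a finished schedule respects the block zones when no task interval overlaps a blocked window:

```python
# Sketch of the Block Zone requirement itself: the real optimizer enforced
# this inside the CP model, here we only *check* a finished schedule.
# Intervals are half-open (start, end) tuples in abstract time units.

def overlaps(a, b):
    """True if half-open intervals a and b share any time."""
    return a[0] < b[1] and b[0] < a[1]

def respects_block_zones(tasks, block_zones):
    """No task may run, even partially, inside a blocked window."""
    return not any(overlaps(t, z) for t in tasks for z in block_zones)

# Maintenance window from t=10 to t=12 (made-up numbers).
zones = [(10, 12)]
print(respects_block_zones([(0, 5), (5, 10), (12, 15)], zones))  # True
print(respects_block_zones([(8, 11)], zones))                    # False
```

In the CP model the equivalent is a no-overlap condition between each task interval and each blocked interval, on top of the usual machine no-overlap constraints of the Flexible Job Shop Problem.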

IIoT integration and custom connector

After a while, I worked as the main developer and maintainer of the Python IIoT integration of the main product with production line PLCs, the majority of them communicating via OPC-UA.

Most of the integration was done thanks to ThingsBoard, which allowed storing and processing data (with ECMAScript).

My task was to work with customers to determine what significant data they wanted to monitor from IIoT devices.

Soon we found some problems with specific devices which required me to develop a custom OPC-UA connector (in Python).

Smaller developments and honorable mentions

Sporadic main project C# bugfixing

Python CLI tool used to compare MSSQL database schemas to check for anomalies / customizations while the standardization of the product was in progress.

Rust CLI tool that calls multiple APIs to create a list of IIoT devices and their respective monitored variables and exports it as a CSV.

Rust server for testing purposes which had 2 endpoints: the first worked as an “echo” by returning the body that was sent (with the same mimetype), and the second simulated a 500 Internal Server Error. This last tool was Dockerized in an extreme way to see how small a Docker image could be (the result was 2 MB).
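As a sketch of the schema-comparison idea only (the table and column names below are invented; the real tool read the schemas from two MSSQL instances, which is not shown here):

```python
# Hypothetical sketch of the schema-diff idea: compare two schema snapshots
# (table -> set of columns) and report anomalies against the standard.

def diff_schemas(standard, customer):
    missing_tables = standard.keys() - customer.keys()
    extra_tables = customer.keys() - standard.keys()
    column_drift = {
        table: {
            "missing": standard[table] - customer[table],
            "extra": customer[table] - standard[table],
        }
        for table in standard.keys() & customer.keys()
        if standard[table] != customer[table]
    }
    return {"missing_tables": sorted(missing_tables),
            "extra_tables": sorted(extra_tables),
            "column_drift": column_drift}

std = {"orders": {"id", "qty"}, "users": {"id", "name"}}
cust = {"orders": {"id", "qty", "legacy_code"}, "invoices": {"id"}}
print(diff_schemas(std, cust))
```

The real tool's report of missing/extra tables and column drift made it easy to spot which customer installations had diverged from the standardized product.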

Studies
  • UNIVAQ
    Bachelor's Degree
    2022-10
    Computer Science
    Thesis here!
    Final score
    99/110
  • I.I.S. A. Volta
    Graduate
    2016-07
    Computer Science and Telecommunication
    Final score
    92/100