Damian Skrzypiec

Software Engineer · Data Engineer
+48 509-777-939
info@dskrzypiec.dev
dskrzypiec.dev
ssh -p 2222 cv.dskrzypiec.dev
Work Experience
Founding Engineer 09.2024 – present
ConnectyAI
Startup
  • Core ConnectyAI Backend in Go
    • Design and implementation of REST API endpoints
    • Developing core product features: Optimization Guides, Business Goal resolution (DAG-based), Dashboards, and a cron-based Workflow scheduler
    • Implementing permissions system, org plan management, and Auth0 M2M public API
    • Integrations with AI agents
  • Implementing data connectors for multiple warehouse types
    • Full support for BigQuery, Snowflake, Postgres, AWS Athena, Databricks, and AWS Redshift
    • Object listing, column listing, data sampling, statistics, and query history per connector
    • Postgres-based concurrency limiter with FIFO fairness for rate-limiting client connections
  • Building and maintaining data sync infrastructure
    • Day-0 sync process, catalog sync, and Query History sync pipeline
    • Event-driven Events Processor on AWS Lambda + SQS with progress tracking
    • Virtual Objects/Columns detection and cross-workspace enrichment copying
  • Infrastructure using AWS CDK
    • Migration of Aurora PostgreSQL clusters to encrypted instances using DMS and pg_restore
    • Setup and maintenance of ECS services, Lambda functions, SES, and bastion hosts
Software Engineer 02.2023 – 08.2024
Point72
Risk Technology
  • Developing a new distributed computation and analytics platform for the Risk team
    • Developing and maintaining platform infrastructure based on Ray and AWS EKS
    • Developing the core Python library that serves as the gateway to the new platform
    • Implementing Ray actor lifecycle management
    • Improving the platform's performance
  • Introducing Apache Airflow as a new scheduler in Risk Tech using AWS MWAA
    • Setting up MWAA infrastructure via Terraform
    • Integrating with other internal systems — SQL Server and the Ray cluster on AWS EKS
    • Implementing a customized local Airflow setup using Docker
    • Implementing ETL processes preparing data for the new platform, using Airflow DAGs
    • Designing the release process and CI/CD pipelines
Senior Data Engineer 08.2022 – 01.2023
TCL Research Europe
Cloud Team
  • Developing Overseas Big Data Platform for TCL
    • Development and maintenance of data pipelines — Spark (Scala) with AWS Glue jobs
    • Development and maintenance of AWS infrastructure (8 global regions) using AWS CDK with TypeScript
  • Performing infrastructure code migration from AWS CDK v1 into AWS CDK v2
  • Maintenance of a tracking platform handling events in real time using a data mesh-like approach
  • Development of cross-cloud ETLs — AWS to AWS China and AWS to Alibaba Cloud (Aliyun)
  • Development of end-to-end tests for Spark (Scala) transformations
  • Development and maintenance of AWS Lambda in Python
Senior Backend Developer 01.2022 – 10.2022
Territory Read
Startup (part time)
  • Backend implementation of a web app in Go
    • Implementation of user authentication and authorization
    • Implementation of login using OAuth2
  • Design and implementation of an ETL process for integrating third-party database serializations
    • Loading data from AWS S3 into a Go program on AWS Lambda, with asynchronous writes into AWS Aurora
  • Implementation of a search engine combining an internal database with an external HTTP API
  • Implementation of communication between backend and frontend using the WebSocket protocol
Senior Data Engineer (C#) 08.2021 – 07.2022
Allegro Pay
DEA
  • Maintenance of the core analytical databases (Snowflake)
  • Design and implementation of data pipelines (Apache Airflow)
  • Design and implementation of the process for loan evaluation
    • Setting up Google Composer infrastructure
    • Designing the data model schema
    • Implementing Apache Airflow DAGs for the process
    • Designing reporting and communication with other processes
  • Consulting on and reviewing C# services regarding data models and Snowflake
  • Implementation of analytical data model based on technical events using Snowflake and dbt
  • Integration of all historical data on delivery addresses from Cassandra and MongoDB into Snowflake using Go
Senior Backend Developer (C#) 05.2020 – 07.2021
PwC
Data Tools
  • Maintenance and development of an IFRS 9 application for an international bank
    • Implementation of new features on the backend (C# and SQL Server)
    • Improvement of the key processes of the application
    • Development of end-to-end tests
    • Development of CI/CD processes (TeamCity and Octopus)
  • Implementation of .NET Core backend prototypes for web apps
  • Guiding and performing code reviews for junior devs
Senior Data Scientist 10.2019 – 04.2020
PwC
Financial Risk Management
  • Design and implementation of an application for credit risk models validation
    • Designing the application database schema (SQL Server)
    • Backend implementation (HTTP server in Go)
    • Communication with analytic layer (R) and the frontend (Angular)
  • Design and implementation of an IFRS 9 application for a factoring company
    • Managing a four-person team
    • Designing the database (SQLite)
    • Implementation of the estimation, calculation and reporting layers
  • Design and implementation of tools (R) for building behavioral models in market risk
  • Internal workshops — Go language (8h)
Senior Consultant 10.2017 – 09.2019
PwC
Financial Risk Management
  • Designing and leading development of internal tools for modelling
  • Implementation of automated parameter estimation for credit risk models
  • Implementation of stress-test calculations for expected credit loss
    • Designing the database schema (SQL Server)
    • High-performance calculation engine in Go
    • Communication with analytical layer (R)
  • Testing (unit and end-to-end) of a monolithic IFRS 9 application (C#)
  • Internal workshops — git (2h), R (20h) and an introduction to HTTP protocol (2h)
Consultant 02.2015 – 09.2017
PwC
Financial Risk Management
  • Building and implementing models for credit risk
  • Using machine learning to improve parts of credit risk models
  • Algorithm performance optimization (C# and T-SQL)
  • Data analysis, reporting and validations (R)
Intern & Junior Analyst 05.2014 – 01.2015
Orange
Business Intelligence
  • Developing a prototype of a real-time application using client geodata (R, Shiny)
  • Developing a VBA application for periodic reports with data visualization and statistics