Company Description
Are you a Senior Software Developer passionate about building scalable, high-performance automation solutions? Join us at Sigma Software and work on a project that replaces manual processes with robust, Python-driven pipelines.
This is a full-time role open to specialists based in Ukraine, Europe, Armenia, Georgia, and Uzbekistan. You will be part of our Data & Cloud Solutions unit, collaborating remotely with a highly skilled team.
At Sigma Software, we value innovation, ownership, and technical excellence. Why join us? You’ll have the opportunity to work with cutting-edge technologies, contribute ideas, and see them implemented in real-world solutions impacting thousands of businesses globally.
CUSTOMER
Our customer is ConnectWise, a US-based software company specializing in IT management solutions for Managed Service Providers (MSPs). ConnectWise offers a suite of tools helping MSPs deliver services such as remote monitoring, cybersecurity, backup/restore, and administrative management for small and medium-sized businesses. Founded in 1982 and headquartered in Tampa, Florida, the company serves thousands of MSPs worldwide and is recognized for its focus on automation, integration, and enabling service providers to scale efficiently.
PROJECT
You will be working on an Automated Storage Capacity & Lifecycle Pipeline — a system designed to predict storage capacity risks and hardware End-of-Life (EoL) impacts across distributed environments. The project aims to replace manual, Excel-based reporting processes with Python-driven ETL pipelines, integrate with monitoring systems, and ensure data is aggregated, normalized, and stored for historical tracking.
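To give a flavor of the kind of pipeline described above, here is a minimal extract-transform-load sketch. All names and values are illustrative assumptions: the sample payload stands in for a Prometheus/VictoriaMetrics query result, and an in-memory SQLite database stands in for the PostgreSQL store used on the project.

```python
import sqlite3

# Hypothetical raw samples, shaped like a monitoring API response
# (a stand-in for a Prometheus / VictoriaMetrics query result).
RAW_SAMPLES = [
    {"region": "eu-west", "node": "stor-01", "used_bytes": 7.0e11, "total_bytes": 1.0e12},
    {"region": "eu-west", "node": "stor-02", "used_bytes": 9.2e11, "total_bytes": 1.0e12},
    {"region": "us-east", "node": "stor-03", "used_bytes": 4.0e11, "total_bytes": 2.0e12},
]

def transform(samples):
    """Normalize raw samples into rows with a utilization ratio."""
    return [
        (s["region"], s["node"], s["used_bytes"], s["total_bytes"],
         s["used_bytes"] / s["total_bytes"])
        for s in samples
    ]

def load(conn, rows):
    """Persist normalized rows for historical tracking."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS capacity "
        "(region TEXT, node TEXT, used_bytes REAL, total_bytes REAL, utilization REAL)"
    )
    conn.executemany("INSERT INTO capacity VALUES (?, ?, ?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")  # stand-in for PostgreSQL
load(conn, transform(RAW_SAMPLES))

# Flag nodes above a 90% utilization threshold (threshold is illustrative).
at_risk = [row[0] for row in conn.execute(
    "SELECT node FROM capacity WHERE utilization > 0.9"
)]
```

In production, the same shape would typically use SQLAlchemy or psycopg2 against PostgreSQL, with scheduled runs appending snapshots for historical trend analysis.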
Job Description
- Design, develop, and maintain Python-based ETL pipelines for infrastructure and capacity data
- Implement robust data access layers using SQLAlchemy and psycopg2
- Integrate with monitoring and metrics systems (Prometheus / VictoriaMetrics) and inventory sources (NetBox)
- Develop aggregation and normalization logic for capacity data across regions and node types
- Persist and manage historical datasets in PostgreSQL
- Collaborate with Data Engineers and stakeholders to define forecasting and reporting requirements
- Improve reliability, performance, and maintainability of automated pipelines
- Replace manual, Excel-based workflows with scalable, auditable, automated solutions
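The aggregation and normalization responsibility above can be sketched as a small roll-up over per-node records. Field names and figures are hypothetical, chosen only to show the per-(region, node type) grouping the role calls for.

```python
from collections import defaultdict

# Hypothetical normalized capacity records; field names are illustrative.
RECORDS = [
    {"region": "eu-west", "node_type": "ssd", "used_tb": 120.0, "total_tb": 200.0},
    {"region": "eu-west", "node_type": "hdd", "used_tb": 300.0, "total_tb": 500.0},
    {"region": "us-east", "node_type": "ssd", "used_tb": 80.0,  "total_tb": 100.0},
]

def aggregate(records):
    """Roll capacity up per (region, node_type) and compute utilization."""
    totals = defaultdict(lambda: {"used_tb": 0.0, "total_tb": 0.0})
    for r in records:
        key = (r["region"], r["node_type"])
        totals[key]["used_tb"] += r["used_tb"]
        totals[key]["total_tb"] += r["total_tb"]
    return {
        key: {**v, "utilization": v["used_tb"] / v["total_tb"]}
        for key, v in totals.items()
    }

summary = aggregate(RECORDS)
```

A summary like this, persisted per pipeline run, is what replaces the manual Excel roll-ups: each run appends an auditable snapshot that forecasting can read back later.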
Qualifications
- At least 5 years of professional experience in Python development, including data processing and automation
- Strong hands-on experience with SQLAlchemy and psycopg2 for building and maintaining database interaction layers
- Solid background in working with SQL and relational databases, preferably PostgreSQL
- Practical experience with monitoring and metrics systems such as Prometheus or VictoriaMetrics
- Proven ability to take ownership of solutions from design through to production delivery
- Excellent communication skills for collaboration with both technical and non-technical stakeholders
- Upper-Intermediate level of English
WOULD BE A PLUS:
- Experience with data visualization tools such as Grafana, Kibana, or PowerBI
- Familiarity with NetBox or other CMDB/inventory management tools
- Knowledge of storage systems, capacity planning, or infrastructure lifecycle management
- Background in working with time-series data or forecasting-related solutions
- Proven track record in designing or maintaining ETL pipelines in production environments
- Understanding of data aggregation, normalization, and historical tracking processes
- Experience with cloud or hybrid infrastructure environments
Additional Information
PERSONAL PROFILE:
- Self-driven and proactive
- Strong analytical and problem-solving skills
- Comfortable working in collaborative, cross-functional teams
- Able to communicate complex technical ideas clearly