Responsibilities:
- Develop and maintain backend APIs and microservices using Python (FastAPI or similar).
- Work with data pipelines that handle large datasets using formats like Parquet and CSV.
- Integrate with cloud storage (e.g., AWS S3) and databases like MongoDB and Redis.
- Write efficient, maintainable, and well-documented code.
- Collaborate with cross-functional teams on design, development, and deployment.
- Contribute to improving performance, scalability, and reliability of systems.
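To give a flavor of the pipeline work above, here is a minimal, hypothetical sketch of streaming a large CSV and aggregating it without loading the whole file into memory. The file contents, column names, and function name are illustrative, not part of any actual codebase:

```python
import csv
import io
from collections import defaultdict

def total_by_key(lines, key_field, value_field):
    """Stream CSV rows and sum a numeric column per key.

    Accepts any iterable of text lines (a file object, an S3
    streaming body wrapped in a text decoder, etc.), so large
    files can be processed without loading them fully into memory.
    """
    totals = defaultdict(float)
    for row in csv.DictReader(lines):
        totals[row[key_field]] += float(row[value_field])
    return dict(totals)

# Illustrative in-memory data; a real pipeline would read from disk or S3.
sample = io.StringIO("region,amount\neu,10\nus,5\neu,2.5\n")
print(total_by_key(sample, "region", "amount"))  # → {'eu': 12.5, 'us': 5.0}
```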
Requirements:
- 2–5 years of professional experience in Python development.
- Experience with at least one backend framework (FastAPI or similar).
- Strong understanding of REST APIs, async programming, and data serialization.
- Familiarity with databases (MongoDB, PostgreSQL, or Redis).
- Knowledge of AWS S3 or other cloud storage services.
- Hands-on experience with Git, Docker, and modern development workflows.
- Good problem-solving skills and attention to detail.
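As a sketch of the async programming mentioned above, the snippet below fans out several simulated I/O calls concurrently with `asyncio.gather`. The `fetch_user` function is a hypothetical stand-in; real code would await an HTTP client or database driver instead of `asyncio.sleep`:

```python
import asyncio

async def fetch_user(user_id: int) -> dict:
    """Stand-in for an async HTTP or database call; the sleep
    simulates I/O latency (no real network access here)."""
    await asyncio.sleep(0.01)
    return {"id": user_id, "name": f"user-{user_id}"}

async def fetch_all(user_ids):
    # gather() runs the coroutines concurrently, so total wall
    # time is roughly one call's latency, not the sum of all.
    return await asyncio.gather(*(fetch_user(u) for u in user_ids))

users = asyncio.run(fetch_all([1, 2, 3]))
print([u["name"] for u in users])  # → ['user-1', 'user-2', 'user-3']
```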
Nice to Have:
- Experience with data processing frameworks (Pandas, PyArrow, Polars).
- Exposure to ETL pipelines, streaming systems, or cloud deployments.
- Understanding of caching, API rate limits, and distributed system design.
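One common building block behind the rate-limit point above is a token bucket. The class below is a simplified, single-process sketch (class and parameter names are illustrative); production systems typically back this with Redis or an API gateway:

```python
import time

class TokenBucket:
    """Minimal token-bucket rate limiter: `rate` tokens are added
    per second up to `capacity`; each request consumes one token."""

    def __init__(self, rate: float, capacity: float, clock=time.monotonic):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity
        self.clock = clock  # injectable for deterministic testing
        self.last = clock()

    def allow(self) -> bool:
        now = self.clock()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(rate=1, capacity=2)
print([bucket.allow() for _ in range(3)])  # → [True, True, False]
```

Injecting the clock keeps the limiter testable without real sleeps, a small design choice that pays off in CI.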
Why Join Us:
- Work on modern backend and data systems used in production environments.
- Collaborate with a small, talented team focused on quality and scalability.
- Opportunity to learn and grow across data engineering, cloud, and backend development.