
AgileEngine
AgileEngine has a job opening for Data Infrastructure Engineer ID35383 in Salvador / BA

Position:
Data Infrastructure Engineer ID35383 – Salvador – AgileEngine
Requirements:
AgileEngine is one of the Inc. 5000 fastest-growing companies in the US and a top-3 ranked dev shop according to Clutch. We create award-winning custom software solutions that help companies across 15+ industries change the lives of millions.
If you enjoy a challenging environment working with top talent and are encouraged to learn and experiment daily, this is the place for you!
What You Will Do
- Architect, build, and maintain modern real-time and batch data analytics pipelines;
- Develop and maintain declarative data models and transformations;
- Implement data ingestion integrations for streaming and traditional sources such as Postgres, Kafka, and DynamoDB;
- Deploy and configure BI tools for data analysis;
- Collaborate with product, finance, legal, and compliance teams to build dashboards and reports supporting business operations, regulatory obligations, and customer needs;
- Establish, communicate, and enforce data governance policies;
- Document and share best practices regarding schema management, data integrity, availability, and security;
- Protect sensitive data through secure permissioning, data masking, and tokenization;
- Identify and communicate data platform needs, including tooling and staffing;
- Work with cross-functional teams to define requirements, plan projects, and execute them.
Minimum Requirements
- 5+ years of engineering and data analytics experience;
- Strong SQL and Python skills for complex data analysis;
- Hands-on experience with automation tooling and pipelines using Python, Go, or TypeScript;
- Experience with data pipeline and warehouse tools like Databricks, Spark, or AWS Glue;
- Experience with infrastructure-as-code using Terraform;
- Proficiency in declarative data modeling and transformation tools like DBT;
- Familiarity with real-time data streaming technologies such as Kafka or Spark;
- Experience with data orchestration platforms like Airflow;
- Background in cloud-based data lakes and secure data practices;
- Ability to work independently and drive projects end-to-end;
- Upper-intermediate English proficiency.
Preferred Skills
- Familiarity with container orchestration (e.g., Kubernetes);
- Experience managing external data vendors;
- Exposure to Web3 / Crypto data systems;
- Experience working cross-functionally with compliance, legal, and finance teams;
- Leadership in data governance or permissioning frameworks;
- Strong focus on simplicity, speed, and avoiding overengineering.
Benefits
- Professional growth: Mentorship, TechTalks, and personalized development plans.
- Competitive compensation: USD-based salary, education, fitness, and team activity budgets.
- Exciting projects: Work with Fortune 500 clients and top-tier brands.
- Flextime: Flexible work hours, remote and in-office options.
Salary:
To be agreed
Benefits:
Not specified