Big-Data Full Stack Engineers

Remote USA Only Published 3 weeks ago


Location: 100% remote

Long-term contract


Responsibilities:

  • Integrate data pipelines from the data warehouse (SQL Server) to the Big Data (MongoDB) platform using Apache Kafka data streams
  • Design, develop, and unit-test new and existing data integration solutions to meet business requirements
  • Develop best practices for data integration/streaming
  • Design and develop data integration/engineering workflows on Big Data technologies and platforms
  • Develop workflows in a cloud environment (AWS, Azure, or Google Cloud Platform)
  • Develop dataflows and processes for the data warehouse using SQL
  • Develop data integration workflows using Web Services (REST/SOAP), XML, JSON, and flat-file formats
  • Create and maintain optimal data pipeline architecture
  • Leverage microservices to join data streams/pipelines


Requirements:

  • 6+ years of total experience
  • 3+ years with a data streaming platform (Apache Kafka preferred)
  • Experience with SQL, ETL, Web Services, and microservices
  • Exposure to Change Data Capture (CDC)
  • Proven experience integrating enterprise software
  • Bachelor's degree in Computer Science, Engineering, or a related field from an accredited university


Nice to have:

  • Experience with Big Data technologies
  • Knowledge of DevOps practices and tools

NucleusTeq culture

Our positive and supportive culture encourages our associates to do their best work every day. We celebrate individuals by recognizing their uniqueness and offering them the flexibility to make daily choices that help them be healthy, centered, confident, and aware. We offer well-being programs and are continuously looking for new ways to maintain a culture where our people excel and lead healthy, happy lives.