Location: Jersey City, NJ, USA
Streamvector Inc. d/b/a Sigmoid Analytics seeks to hire multiple Technical Leads responsible for performing the following duties:

1. Expand and improve the SigView platform by developing, architecting, and building new features, software, and applications on top of the platform to specifications required by specific projects.
2. Collaborate closely with clients to gather requirements and participate in scoping meetings to translate business problems into feasible technical solutions.
3. Manage large teams and ensure successful delivery of projects to customers by providing low-latency, high-quality products.
4. Design, build, install, configure, and support the big data tech stack using Amazon Web Services (AWS), Apache Kafka, Java, Spring Boot, Microservice Architecture, Novell IDM, OAuth, and SAML.
5. Create and implement resilient and scalable data architectures, encompassing both High-Level Design (HLD) and Low-Level Design (LLD).
6. Ensure data quality, reliability, and performance for processing petabytes of data by utilizing tools including Apache Airflow, Prometheus, Great Expectations, Terraform, Docker, and Grafana.
7. Work in close collaboration with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand and address data-related issues.
8. Oversee client communication and employ Continuous Integration/Continuous Deployment (CI/CD) processes for continuous delivery.
9. Implement frameworks and strategies to safeguard personally identifiable information (PII) for security compliance requirements including GDPR, SOC 2, and CCPA.
10. Use tools including BigQuery and Azure Data Factory and apply artificial intelligence and machine learning algorithms to develop robust pipelines for petabytes of data.

The candidate must have a Bachelor's (or foreign educ. equiv.) Degree in Computer Science, Software Engineering, Electronics Engineering, or a related field plus five (5) years of (post-degree, progressive) experience in the job offered or a related occupation. Experience must include the following skills:

a) Manage large teams and ensure successful delivery of projects to customers.
b) Implement robust data pipelines to reliably manage petabytes of data using Apache Airflow, Prometheus, Great Expectations, Terraform, Docker, and Grafana.
c) Support the big data tech stack using Amazon Web Services (AWS), Apache Kafka, Java, Spring Boot, Microservice Architecture, Novell IDM, OAuth, and SAML.
d) Create and implement data architectures encompassing both High-Level Design (HLD) and Low-Level Design (LLD).
e) Employ Continuous Integration/Continuous Deployment (CI/CD) processes.
f) Apache Spark, Google Cloud Platform (GCP), MongoDB, Databricks, Snowflake, BigQuery, and Azure Data Factory.
g) Application of artificial intelligence and machine learning algorithms.

Benefits include sick & vacation pay.