Launch your career in the world of AI...
Are you ready to be part of an innovative, rapidly growing AI startup?
At Paradigm, we're on a mission to harness the power of artificial intelligence to create groundbreaking solutions, and we're searching for passionate, talented individuals to join our dynamic team. If you thrive in a fast-paced environment, are eager to make an impact, and share our commitment to continuous learning and improvement, then Paradigm is the perfect place for you.
Apply today and become a driving force in shaping the future of technology!
Key Responsibilities:
- Design, build, install, and maintain large-scale data processing systems and related infrastructure.
- Build and manage complex ETL pipelines to ensure efficient data movement and cleansing.
- Create and maintain scalable and optimized databases, ensuring data consistency and integrity.
- Develop backend functionality to retrieve data from databases efficiently.
- Implement database alerts to monitor for undesirable behaviors and ensure system health.
- Conduct testing and quality assurance to ensure high reliability and performance of data resources.
- Implement data security measures to safeguard sensitive information.
- Collaborate closely with data scientists, analysts, and other stakeholders to support their data infrastructure needs.
- Stay updated with the latest industry trends and technologies to ensure our systems are current and competitive.
Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field. Master's degree preferred.
- Minimum of 2 years of experience in a Data Engineering role.
- Strong proficiency in SQL and experience with relational and NoSQL databases such as PostgreSQL, MySQL, and MongoDB.
- Expertise in ETL processes and data pipeline tools such as Apache Kafka, Apache Beam, or Talend.
- Deep understanding of database design principles, data modeling, and architecture.
- Experience with cloud platforms, preferably AWS.
- Proficiency in backend programming languages, such as Python, Java, or Scala.
- Familiarity with containerization and orchestration tools such as Docker and Kubernetes.
- Understanding of data encryption, tokenization, and other security protocols.
- Strong analytical and problem-solving skills, with a keen attention to detail.
- Excellent communication skills and ability to work collaboratively within a team.
Desirable Skills:
- Experience with big data technologies like Hadoop, Spark, or Flink.
- Knowledge of data visualization tools like Tableau, Looker, or Power BI.
- Familiarity with Continuous Integration/Continuous Deployment (CI/CD) tools.
- Experience with Agile methodologies.
What We Offer:
- Competitive Salary and Bonus Structure
- Comprehensive Benefits Package
- Opportunity for Growth and Professional Development
- Collaborative and Innovative Work Environment