UBS is hiring for the position of Data Engineer – Scala!
Responsibilities
- Data Pipeline Engineering: Build reliable data pipelines for sourcing, processing, distributing, and storing data on cloud data platforms.
- Data Transformation: Convert raw data into insights that inform business decisions, leveraging internal data platforms and analytical techniques.
- Automation & Problem-Solving: Develop and apply data engineering techniques to automate manual processes and solve challenging business problems.
- Solution Quality & Compliance: Ensure solutions meet quality, security, reliability, and compliance standards by adhering to digital principles and implementing functional and non-functional requirements.
- Observability & Incident Resolution: Build observability into solutions, monitor production health, and assist with incident resolution and root cause analysis.
- Client Advocacy: Understand, represent, and advocate for client needs in data engineering solutions.
- Knowledge Sharing: Codify best practices and methodologies, and share knowledge with other UBS engineers.
- Architecture Development: Shape the Reference Data Mastering and Distribution architecture within UBS’s new cloud-based data lakehouse.
Requirements
- Technical Expertise:
  - Experience with distributed processing using Databricks (preferred) or Apache Spark.
  - Proficiency in Scala, with expertise in debugging and optimizing Spark jobs.
  - Experience with cloud platforms such as Azure (preferred) or AWS.
- Data Handling Skills:
  - Ability to work across structured, semi-structured, and unstructured data, identifying linkages across disparate datasets.
  - Expertise in creating optimized data structures (e.g., Parquet, Delta Lake).
- Database Proficiency:
  - Experience with at least one database technology:
    - RDBMS: MS SQL Server, Oracle, PostgreSQL.
    - NoSQL: MongoDB, Cassandra, CosmosDB, Gremlin, Neo4J.
- Security & Compliance:
  - Understanding of information security principles to ensure compliant handling and management of data.
- Tools & Platforms:
  - Experience with ETL tools such as Azure Data Factory and Informatica.
  - Proficiency with GitHub, Gitflow, and development tools such as IntelliJ.
- Agile Methodologies:
  - Practical experience with Scrum, XP, or Kanban in Agile workflows.
- Analytical & Problem-Solving Skills:
  - Strong analytical and problem-solving skills for working on large, complex codebases.
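To give a flavor of the stack above, here is a minimal sketch of the kind of Spark job this role involves: reading raw data and writing an optimized, partitioned Delta table in Scala. The table paths, storage account, and column names are hypothetical, and it assumes a Spark 3.x runtime with the Delta Lake connector on the classpath.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object TradeIngest {
  def main(args: Array[String]): Unit = {
    // Assumes Spark 3.x with the Delta Lake extension available.
    val spark = SparkSession.builder()
      .appName("trade-ingest")
      .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
      .config("spark.sql.catalog.spark_catalog",
              "org.apache.spark.sql.delta.catalog.DeltaCatalog")
      .getOrCreate()

    // Hypothetical raw input: CSV files landed by an upstream feed.
    val raw = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("abfss://landing@account.dfs.core.windows.net/trades/")

    // Basic cleansing before distribution (columns are illustrative).
    val cleaned = raw
      .filter(col("trade_id").isNotNull)
      .withColumn("trade_date", to_date(col("trade_date")))

    // Write a partitioned Delta table: columnar storage plus a
    // transaction log for reliable, auditable downstream consumption.
    cleaned.write
      .format("delta")
      .mode("overwrite")
      .partitionBy("trade_date")
      .save("abfss://curated@account.dfs.core.windows.net/trades_delta/")

    spark.stop()
  }
}
```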
This is an exceptional opportunity to work with cutting-edge cloud technologies, solve complex business problems, and contribute to UBS’s innovative data engineering landscape. Apply now to join UBS and advance your career!