LGT is the largest Private Banking and Asset Management group in the world to be owned by an entrepreneurial family. As the family office of the Princely House of Liechtenstein, we have years of experience in the management of sizeable sums of assets.
We are looking for a Data Engineer to join our Platform Engineering team on a 12‑month fixed‑term contract. This role bridges data engineering, analytics, and applied data science, with a strong focus on reusability, explainability, and regulatory alignment.
You will work closely with the Principal Engineer to design, build, and operationalise a high quality, governed data foundation that enables analytics, advanced modelling, and AI use cases across the organisation.
Key Responsibilities
Contribute to the design and evolution of the enterprise data foundation, including core and presentation data layers, in line with the modern data stack and logical data models already defined.
Build well‑defined, reusable data products (datasets, features, semantic models) that can be consumed by analytics, Artificial Intelligence (AI) models, and downstream applications.
Partner with data engineers to define data structures that support historical accuracy, auditability, and lineage, which are critical in a regulated wealth environment.
Perform exploratory data analysis to identify patterns, data quality issues, and AI opportunities.
Support AI use cases by ensuring data is fit‑for‑purpose, well‑documented, and reproducible.
Work within the enterprise data governance framework, including data quality rules and monitoring, as well as metadata, lineage, and glossary contributions.
Collaborate with Data Owners to resolve data issues and clarify definitions.
Ensure datasets and models meet regulatory, privacy, and audit requirements relevant to financial services.
Collaborate closely with engineering, architecture, compliance, and business teams.
Contribute to standards, templates, and best practices for analytics and data science delivery.
Support enablement of other teams by creating reusable assets and documentation.
Working alongside your IT infrastructure, project delivery, and security colleagues, you will take ownership of the day-to-day operability of the data integration platform to ensure business functions run smoothly.
Skills & Experience
Strong experience with Python for data analysis and modelling (e.g. pandas, NumPy, scikit‑learn or equivalent).
Solid SQL skills and experience working with cloud data warehouses and lakehouses.
Experience working in a modern data platform (e.g. Microsoft Fabric, Synapse, Snowflake, Databricks).
Understanding of data modelling concepts (e.g. dimensional models, Data Vault or similar enterprise patterns).
Proven experience working on data foundations, not just dashboards or isolated models.
Experience creating reusable, governed datasets or features intended for multiple downstream consumers.
Familiarity with metadata management, data lineage, and data quality frameworks.
Experience supporting AI use cases or early‑stage production models is highly desirable.
Experience working in financial services or another regulated industry.
Awareness of data privacy, auditability, and model risk considerations.
Ability to balance innovation with control, documentation, and traceability.
Strong analytical thinking with the ability to explain complex concepts simply.
Comfortable working with ambiguity and helping shape the right data approach rather than being handed fully‑formed requirements.
Collaborative mindset and ability to work closely with senior engineers, architects, and business stakeholders.
Experience with CI/CD pipelines.
Desirable Skills
Experience with Microsoft Purview or similar governance/cataloguing tools.
Exposure to GenAI use cases (LLMs, embeddings, retrieval‑augmented generation).
Experience contributing to an AI or Data Centre of Excellence model.
Experience in preparing data assets specifically for AI and GenAI use cases, including feature engineering, embedding generation, and structured/unstructured data preparation.
Degree or qualification in machine learning.
Role Competencies
Strong problem-solving abilities, with a logical and methodical approach to tasks.
Excellent communication skills, able to translate technical concepts for non-technical stakeholders.
Commitment to maintaining high-quality standards in all tasks.
Ability to manage changing priorities and work in a dynamic, proactive manner.
A passion for emerging technologies and an interest in industry developments in this fast-moving sector.
Qualifications
Industry certifications in the above technologies are highly desirable.
Tertiary degree in software engineering, computer science, or a related discipline.
Transparency is important to us. That is why you will find everything that matters to us on our website – plus everything you should know about us before you meet us in person, open an account or apply for a job. That includes, for example, the history of the Princely Family, which is closely intertwined with our own.