Samar Zaidi

Data Architect at IBM
Islamabad, IS

Summary

A seasoned Data Architect with over 11 years of expertise in data warehouse design, ETL architecture, and cloud-based solutions, with a record of successfully leading complex data projects across the telecom, finance, education, retail, and manufacturing sectors. Proficiency in optimising data workflows, modernising legacy systems, and building scalable, cost-efficient enterprise architectures has consistently driven operational improvements.

Specialises in managing full project lifecycles, directing teams in requirement analysis, technical documentation, and solution delivery. Equipped with deep knowledge of SQL, Hadoop ecosystems (Hive, Spark), and cloud platforms (AWS, Azure) to design and implement robust data infrastructures tailored to business needs.

Dedicated to leveraging cutting-edge technologies to continuously deliver innovative solutions that streamline operations and ensure high performance. Eager to contribute these skills to impactful, forward-thinking projects.

Overview

11 years of professional experience
4 years of post-secondary education

Work History

Data Architect at IBM

Meezan Bank
Karachi, Sindh
10.2023 - Current
  • Designed internal process improvements to automate repetitive tasks, shortening data delivery times.
  • Migrated more than 100 legacy systems to newer technologies, reducing costs and enhancing the efficiency of computing tasks.
  • Developed and delivered 10 business information solutions.
  • Designed data models for complex analysis needs.
  • Gathered, defined, and refined requirements for 100 reports; led project design and oversaw implementation.
  • Unified disparate data structures and operating environments during revamps of processing facilities, data centers and more.
  • Verified isolation of data to specific geographic borders to comply with national and international data transmission laws and regulations.
  • Drafted conceptual and logical data models for high-level system planning tasks, optimizing each model to customers' needs and budgets.
  • Created multi-site system architecture plans to reduce redundancy across entire organization.

Data Architect at IBM

PMI
London, United Kingdom
10.2022 - 10.2023
  • Took full Data Architect responsibility for the highly complex TOMAUTO build; completed the Presentation and Data Product layers and the underlying PowerON design and build for the TOMAUTO project (TOM is a critical segment of PMI Spare Parts Obsolescence Management).
  • Built WhereScape RED objects and design processes, the obsolete spare parts strategy, and write-back RLS logic.
  • Resolved multiple implementation issues, keeping the delivery timeline smooth.
  • Completed Airflow Migration and addressed all necessary changes.
  • Involved in Global Obsolete Strategy (GOS) testing for the TOMAUTO auto strategy with business stakeholders, establishing a GOS strategy for 90+ GD machines.
  • Designed internal process improvements to automate repetitive tasks, shortening data delivery times.
  • Migrated numerous legacy systems to newer technologies, reducing costs and enhancing efficiency of computing tasks.

Data Architect at IBM

United Bank Limited
Karachi, Sindh
05.2022 - 10.2022
  • Greenfield implementation of a data lake and data warehouse (DL & DWH) using Informatica Developer, Teradata, and CDP for United Bank Limited (UBL) as part of the bank's digitalization journey.
  • Automated various use-cases for the bank, for example, ATM Cash Optimisation, Branch Cash Optimisation, Fraud Detection, Near Real-time transactions, Customer 360, and Churn.
  • Ingested data from source systems into Hadoop, where data transformations were performed.
  • Created a Semantic layer (SDM) for dashboards.
  • Extracted data from the source system via dynamic mapping to reduce development effort on commonly repeated tasks.
  • Interacted with key stakeholders to understand and translate business needs.
  • Defined and implemented data management, governance, and security policies.
  • Assured maximum data quality for Reference and Master data repositories and managed ETL batches.
  • Wrote custom ETL scripts for new source systems and facilitated ad-hoc/on-demand and customised reporting requests.
  • Delivered a real-time branch transaction monitoring dashboard use case.
  • Migrated numerous legacy systems to newer technologies, reducing costs and enhancing efficiency of computing tasks.
  • Designed internal process improvements to automate repetitive tasks, shortening data delivery times.

Senior Data Engineer at Teradata Pakistan

Jazz Pakistan
Islamabad, IS
04.2021 - 05.2022
  • Jazz initiated a data warehouse technology refresh project to modernise its data ecosystem and serve its approximately 70 million internet and voice customers.
  • Worked as an ETL Development and Capability Senior Data Engineer, managing the technical team, assigning tasks, and measuring team performance.
  • Creation of ETL Strategy.
  • Data Lake setup on object store (MinIO) using Teradata NOS.
  • SLJM setup (shell-based ETL tool).
  • Creation of file management and direct data pull from databases utilities.
  • Enhanced the functionality of the Teradata GCFR and DataOps tools according to project needs.
  • Building and execution of ETL pipelines for Staging, Core, and Access layer using GCFR and DataOps.
  • Ensured data quality through rigorous testing, validation, and monitoring of all data assets, minimizing inaccuracies and inconsistencies.
  • Reengineered existing ETL workflows to improve performance by identifying bottlenecks and optimizing code accordingly.
  • Championed the adoption of agile methodologies within the team, resulting in faster delivery times and increased collaboration among team members.
  • Participated in strategic planning sessions with stakeholders to assess business needs related to data engineering initiatives.
  • Designed robust database architecture that supported seamless integration of new datasets and facilitated rapid analysis capabilities.

Data Engineer

Telenor Pakistan
Islamabad, IS
11.2019 - 05.2021
  • Design and implementation of the architecture of the Revenue Assurance Department of Telenor Pakistan.
  • Designing and Implementing ETL Data Pipelines for data lake.
  • Coordinating with business stakeholders for requirement gathering and documenting more than 100 use cases.
  • Designing system architecture as per best practices.
  • Reviewing the change request and implementation.
  • Functional and technical documentation.
  • Performance optimisation, Scoping, and Capacity planning.
  • Develop data wrangling, data pipelines, ETL, data control, and profiling processes.
  • Analyse current processes and technologies, contributing to the integration of new solutions.
  • Document functional and non-functional user requirements and specifications.
  • Design new business processes, capabilities, and supporting technologies.
  • Design on-premises and cloud-based solutions for Data Lake and warehouse.
  • Capture ETL/BI requirements
  • Designed the processes to meet service level agreements for data timeliness and frequency.
  • Determine data growth trends and peak business periods.
  • Developing reports/dashboards based on the requirements of the functional units.
  • Optimized data processing by implementing efficient ETL pipelines and streamlining database design.
  • Collaborated on ETL (Extract, Transform, Load) tasks, maintaining data integrity and verifying pipeline stability.
  • Led end-to-end implementation of multiple high-impact projects from requirements gathering through deployment and post-launch support stages.
  • Fine-tuned query performance and optimized database structures for faster, more accurate data retrieval and reporting.

Analytics Operations Executive

Telenor Pakistan
Islamabad, IS
04.2018 - 11.2019
  • Requirement gathering, planning, designing, development, and deployment of all change requirements while maintaining the overall performance of ETL processes.
  • Detailed Space Analysis using subscriber forecasted trends on the production system and applying retentions after impact analysis in order to manage space consumption and data growth.
  • Oversaw daily operations activities, ensuring smooth functioning across all departments while maintaining high-quality standards.
  • Efficiently and effectively identified and solved all problems that impacted direction of business.
  • Used data-driven decision-making techniques to identify areas of improvement within operations processes.
  • Implemented strategic initiatives that resulted in significant improvements in operational performance.
  • Collaborated with executive leadership on key initiatives, providing valuable insights from an operational perspective.
  • Set clear goals to monitor targets and offered real-time input on performance and motivation.
  • Hadoop cluster capacity planning, performance tuning, cluster monitoring, troubleshooting.
  • Installation of various Hadoop ecosystem components and daemons on a large 1,000-terabyte cluster.
  • Experience in monitoring and troubleshooting issues with Linux memory, CPU, OS, storage, and network.
  • Hands-on experience in analysing log files for Hadoop and ecosystem services and finding the root cause.
  • Experience in commissioning, decommissioning, balancing, and managing nodes, and tuning servers for optimal performance of the cluster.
  • Cluster maintenance, troubleshooting, monitoring, and following proper backup & recovery strategies.
  • Installing and configuring the Hadoop ecosystem, including Sqoop, Pig, and Hive.
  • Expertise in Hive Query Language and debugging Hive issues.
  • Experience in importing and exporting data using TDCH from HDFS to Relational Database systems/mainframe and vice versa.
  • Optimising performance of Hive, Pig jobs.
  • System performance monitoring through Viewpoint, with detailed performance analysis using weekly and daily performance report configurations and workload customisation.

Enterprise Data Warehouse (EDW) Operation Officer

Telenor Pakistan
Islamabad, IS
05.2016 - 04.2018
  • Creating a Hadoop TEST Cluster.
  • Performance Monitoring of Hadoop through Ambari.
  • Monitoring Data Warehouse ETL Operations and ensuring timely availability of data.
  • Follow up with Telenor source teams like BSS and Network Element Systems to ensure data integrity and completeness.
  • Implementation of Change Requests as per business requirement.
  • Analysis and Investigation of abnormal Data Trends, and carry out reconciliation activities with source teams.
  • Developed recon for major business KPIs in order to ensure data quality and timely issue investigation.
  • FASTLOAD, BTEQ, MULTILOAD, TPT Scripts Development
  • Automation of daily reports and processes using Shell Scripting.
  • Handled ad hoc requests from RA and Fraud Control.
  • Played a significant role in the successful rollout of DPI deployment in production, also involved in optimization of its time-consuming scripts.
  • Played a significant role in the optimization of BVS's time-consuming scripts.
  • Created a DEV environment on a Teradata 680 machine, which involved creating Data Mover scripts.
  • Movement of tables from Prod to Dev machine.
  • Support developers according to their requirements on ongoing projects on DEV environment.
  • Automated process for daily syncing of data from Prod to DEV environment.
  • Planned and prepared the UAT and deployment strategy document.
  • Performed UAT with the client, along with deployment and DevOps of the developed solution.
  • Optimization of deployed solution.
  • Managed cross-functional teams for the successful completion of projects within budget and timeline constraints.
  • Updated and published standard operating procedures (SOPs) using stakeholder, customer, and employee input and feedback, resulting in clearer and more useful instructions for users.
  • Optimized business processes to foster operational efficiency.

Professional Service Consultant

Teradata Pakistan
Islamabad, IS
09.2014 - 05.2016
  • Orbit EDW Migration, Telenor Pakistan Telecommunications, Islamabad. Day-to-day KPI matching.
  • Data Recon for both the new production system and the old production system.
  • Data quality testing on new production system to ensure integrity.
  • Day-to-day creation and updating of reports to be shared with the client.
  • Verification of client report shared with TD.
  • Identification of tables to be moved from 6800 machine via data mover.
  • Facilitate TRAP (Telenor Revenue Assurance Project) loading on 2800 machine.
  • Development, Testing, and execution of data mover scripts.
  • Automation of Data Mover Jobs.
  • Scheduling of Data Mover Jobs.
  • Developed TDCH scripts to move BLC tables from Teradata to Hadoop.
  • Automated TDCH jobs.
  • Scheduled TDCH jobs.
  • Analyzing each file column by column to identify issues in data.
  • Developed counters and key quality indicators; carried out a POC for NEA to benchmark system performance and ensure EDW integration.
  • Calculating system resources for single batches of NEA running parallel with other systems.
  • Ensuring successful integration with EDW, for example, cell site subscription tagging.
  • Created strategies that propelled the business forward in the industry.
  • Collaborated with cross-functional teams to develop innovative solutions, ensuring successful project completion within budget and timelines.
  • Facilitated knowledge sharing and collaboration among team members by organizing seminars, workshops, and creating internal resources.
  • Boosted overall client satisfaction ratings through consistent delivery of high-quality services, tailored solutions, and timely support.
  • Delivered high-quality presentations to clients, demonstrating the value of proposed solutions and securing buy-in from key stakeholders.

Education

Bachelor of Science - Software Engineering

University of Sargodha
Sargodha, Punjab, Pakistan
04.2010 - 01.2014

Skills

SQL Expertise

Current Employer

IBM

Certificates & Trainings

  • Teradata utilities (BTEQ, FEXP, MLOAD, and FASTLOAD) Training
  • Agile Project Management Training
  • TERADATA NOS Training
  • Data Integration & Big Data ETL Training
  • GCFR Training
  • Spark and Big Data Trainings
  • Conducted SLJM (shell-based ETL tool) Training
  • Data Vault Training
  • WhereScape 3D Modeling Tool Training
  • WhereScape RED and Airflow Trainings

Tools & Technical Skills

  • Snowflake
  • AWS
  • Azure
  • WhereScape 3D
  • DWH & Analytics
  • Teradata GCFR & DataOps
  • Data Modeling
  • Data Integration
  • Scripting/ Custom ETL
  • Informatica BDM 10.4.1
  • Teradata Tools & Utilities
  • Apache Spark
  • Technical Documentation
  • Shell Scripting
  • Big Data Technologies
  • Kerberos
  • Ranger
  • Agile Methodology

Consulting Skills

  • Team Leading
  • Project Management
  • Stakeholder Management
  • Product Management
  • Pre-Sales Consulting
  • Resource Management
  • Cost & Budget Control
  • System Analysis
  • Interpersonal Skills
  • Time Management

Languages

English
Urdu

Timeline

Data Architect at IBM

Meezan Bank
10.2023 - Current

Data Architect at IBM

PMI
10.2022 - 10.2023

Data Architect at IBM

United Bank Limited
05.2022 - 10.2022

Senior Data Engineer at Teradata Pakistan

Jazz Pakistan
04.2021 - 05.2022

Data Engineer

Telenor Pakistan
11.2019 - 05.2021

Analytics Operations Executive

Telenor Pakistan
04.2018 - 11.2019

Enterprise Data Warehouse (EDW) Operation Officer

Telenor Pakistan
05.2016 - 04.2018

Professional Service Consultant

Teradata Pakistan
09.2014 - 05.2016

Bachelor of Science - Software Engineering

University of Sargodha
04.2010 - 01.2014