- Hands-on experience in “Big Data” technologies such as Hadoop, Hive, Kafka, and Spark, as well as a general understanding of the fundamentals of distributed data processing (data lakes, ETL/data warehousing, DB design); see the Spark/Kafka sketch after this list.
- Extensive data analysis experience with Agile and Waterfall methodologies; Scrum project management, program management, master data management, and product ownership; and all Software Development Life Cycle phases, from requirements gathering, analysis, design, development, implementation, and integration through data migration from legacy systems to NoSQL and RDBMS applications.
- Extensive experience in roles including Data Architect, Data and ETL Modeler, Developer, Data Analyst, Agile Project Manager, Product Manager, Program Manager, Technical Team Lead, and ETL Architect and Analyst. Managed multiple projects involving OLAP, OLTP, ODS, EDW, MDM, Data Lake, Data Vault, Data Governance, data profiling and cleansing, and defining and designing Anchor Modeling and Focal Point data models.
- Experience coordinating with RDBMS, Hadoop, web services, AWS (S3, EC2, DynamoDB), SOA, REST API, SAP HANA, and ETL development teams, and implementing single sign-on architecture; see the AWS sketch after this list.
- Expert in designing and implementing Star, Snowflake, and Galaxy schema models (dimensional modeling sketch after this list).
- Worked extensively with Teradata, DB2, Oracle, MySQL, SQL Server, Sybase, AWS Redshift, S3, EC2, Snowflake, PostgreSQL, MS Azure, Cosmos DB, and Cassandra.
- Extensively involved in application development, middleware, servers, storage analysis, database management, and the technical and functional operations of the business at the enterprise level.
- Established data extraction methodology for Oracle, SQL Server, and MySQL to migrate data to Big Data/Hadoop and Snowflake cloud DB, leveraging Snowflake Time Travel and zero-copy cloning (Snowflake sketch after this list).
- Hands-on experience and strong proficiency with Scala, Python, and Java.
- Experience with cloud technologies including GCP, AWS, and Azure.
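
To ground the Big Data bullet above, here is a minimal PySpark sketch that consumes a Kafka topic and maintains a streaming count per key. The broker address, topic name `events`, and checkpoint path are placeholders, not details from any specific engagement.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, window, count

# Build a Spark session; the Kafka source package must be on the classpath
# (e.g. via --packages org.apache.spark:spark-sql-kafka-0-10_2.12:3.5.0).
spark = SparkSession.builder.appName("kafka-events-demo").getOrCreate()

# Read a hypothetical "events" topic as a streaming DataFrame.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "events")
    .load()
)

# Count messages per 1-minute window, keyed on the Kafka message key.
counts = (
    events.selectExpr("CAST(key AS STRING) AS key", "timestamp")
    .groupBy(window(col("timestamp"), "1 minute"), col("key"))
    .agg(count("*").alias("events"))
)

# Stream the running counts to the console; checkpoint path is a placeholder.
query = (
    counts.writeStream.outputMode("update")
    .format("console")
    .option("checkpointLocation", "/tmp/checkpoints/events")
    .start()
)
query.awaitTermination()
```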
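As a companion to the AWS coordination bullet, a minimal boto3 sketch of the S3 and DynamoDB calls such work typically involves. The bucket name `example-data-lake` and table name `orders` are hypothetical.

```python
import boto3

# Clients pick up credentials from the environment or ~/.aws/credentials.
s3 = boto3.client("s3")
dynamodb = boto3.resource("dynamodb")

# List objects under a prefix in a hypothetical data-lake bucket.
response = s3.list_objects_v2(Bucket="example-data-lake", Prefix="raw/orders/")
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"])

# Write an item to a hypothetical DynamoDB table keyed on order_id.
table = dynamodb.Table("orders")
table.put_item(Item={"order_id": "o-1001", "status": "SHIPPED", "amount": 42})
```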
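To illustrate the dimensional modeling bullet, a small star schema sketch using Python's built-in sqlite3 module: one fact table joined to two dimension tables. Table and column names are illustrative only.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Dimension tables: descriptive attributes, one row per member.
conn.executescript("""
CREATE TABLE dim_date (
    date_key   INTEGER PRIMARY KEY,
    full_date  TEXT,
    month      INTEGER,
    year       INTEGER
);
CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,
    product_name TEXT,
    category     TEXT
);
-- Fact table: foreign keys to each dimension plus additive measures.
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity    INTEGER,
    revenue     REAL
);
""")

# A typical star-join query: slice the measures by dimension attributes.
rows = conn.execute("""
    SELECT d.year, p.category, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_date d    ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY d.year, p.category
""").fetchall()
```

A snowflake schema would further normalize dim_product (e.g. splitting category into its own table), and a galaxy schema adds additional fact tables that share these conformed dimensions.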
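For the Snowflake Time Travel and zero-copy cloning bullet, a sketch assuming the snowflake-connector-python package; the credentials are placeholders and the orders table is hypothetical.

```python
import snowflake.connector

# Placeholder credentials; in practice these come from a secrets manager.
conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",
    warehouse="ETL_WH", database="ANALYTICS", schema="PUBLIC",
)
cur = conn.cursor()

# Time Travel: query the table as it looked one hour ago.
cur.execute("SELECT COUNT(*) FROM orders AT (OFFSET => -3600)")
print(cur.fetchone())

# Zero-copy clone: an instant, storage-free copy for validating a migration.
cur.execute("CREATE TABLE orders_clone CLONE orders")
```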