
Sr. Snowflake Developer Resume


Quincy, MA

SUMMARY

  • 16+ years of development experience in roles including Project Lead, Team Lead, and Senior Programmer Analyst/Lead Developer, spanning ETL (DataStage), PL/SQL (Oracle), Snowflake Cloud DB, IICS, data warehousing, Unix scripting, and the analysis, design, and implementation of business applications.
  • Extensive experience in all phases of the SDLC (Software Development Life Cycle), from requirement-level estimation, functional and technical systems analysis, and design-level estimation through design, implementation, testing, and maintenance, using SQL and PL/SQL (Oracle, DB2, Sybase, Microsoft SQL Server 2008), ETL (Confidential DataStage and Informatica), XML, and scripting in KIX (Windows) and Unix (sed, awk, Perl).
  • Experience across data warehousing/ETL project tasks such as data extraction, cleansing, aggregation, validation, transformation, and loading; prepared data mappings and analysis based on the data requirements; produced all aspects of project documentation, including functional and technical specifications, testing templates, ETL mappings, and report layouts.
  • Worked with Snowflake features - SnowSQL, stored procedures, Snowpipe, Snowflake Streams, Time Travel, cluster keys, views, COPY, VALIDATE, and the Snowflake APIs (see the Snowflake sketch after this list). Extensive work with IICS Cloud ETL to load data into Snowflake.
  • Experience with Snowflake data migration, moving on-premises databases to the Snowflake cloud.
  • In-depth understanding of Snowflake cloud data warehouse technology.
  • Experience migrating database objects into the Snowflake environment.
  • Heavily involved in testing Snowflake to determine the most efficient use of cloud resources.
  • Able to develop ETL pipelines into and out of the data warehouse using Snowflake's SnowSQL, and to write SQL queries against Snowflake.
  • Experience with dimensional modeling - dimension tables, fact tables, star and snowflake schemas - and an understanding of Snowflake architecture, Time Travel, Fail-safe, zero-copy cloning, and data sharing.
  • Experience with Data-warehousing Concepts - Data Analysis, Data Profiling, Master Data Management, Data Cleansing, Data Transformation, Loading, Data Mart design.
  • Experience with Data Flow diagrams, Database Normalization & De-normalization techniques, ER (Entity-Relational) modeling and design techniques, Dimensional Data modeling, design techniques.
  • Expertise in understanding, rewriting, and writing complex SQL queries - hierarchical queries, complex outer joins, correlated subqueries, and hints for optimized data retrieval (see the SQL sketch after this list).
  • Performance tuning of SQL and PL/SQL code through indexing, hints, or optimal rewrites, using Explain Plan, TKPROF, and other performance analysis techniques.
  • Effectively made use of database objects - tables, constraints, indexes, sequences, triggers, and views.
  • Used database links to access data in upstream/downstream applications and to reach work tables from QA to Production for application maintenance, and external tables to expose file-system data as relational data.
  • Confidential DataStage (ETL tool) - 12 years of experience, with expertise in creating Server and Parallel jobs and their configuration settings; data sources and file stages (Sequential File, Complex Flat File, Data Set); database stages; the Sequencer stage with Exception Handler, Looping, Command Execution, and User Variables; and processing stages including Transformer, Lookup, Merge, Funnel, Filter, Join, Modify, Remove Duplicates, Sort, Change Capture, Copy, Pivot, Surrogate Key, Aggregator, and Slowly Changing Dimension.
  • Worked with different source systems to design extraction criteria and setup automated overnight extraction jobs.
  • Materialized Views - to replicate data in distributed environments.
  • Complex Database Objects - Object Types, Object Tables, Functions, Stored Procedures, and Packages - to build client/server applications and batch processing.
  • Bulk data loads using the SQL*Loader utility.
  • Expertise in creating rule-engine-based designs and implementations using dynamic SQL (a PL/SQL sketch follows this list).
  • Experience with Oracle-supplied packages - UTL_FILE, UTL_HTTP, UTL_SMTP, DBMS_OUTPUT, DBMS_SQL, DBMS_LOB.
  • Partitioning to maintain large database tables - range and list partitioning; used PL/SQL looping features with ref cursors, external cursors, BULK COLLECT, collections, and record types.
  • Exception handling - efficient exception handling across all designed applications, supporting control and error reporting.
  • Oracle 10g/11g/ 12c/ Exadata - Good understanding of the Oracle DB Architecture and internal working mechanisms.
  • Expertise in Unix scripting (find, grep, sed, awk): developed scripts for complex text manipulation, database interaction, batch processing, environment setup, searching and comparison, parallel processing and handlers, directory and file management, self-healing and fail-safe programs, file transfers, and scheduling.
  • Experience in KIX Windows scripting: developed scripts to read, create, and manipulate MS Excel spreadsheets, load data between Excel and the database, manipulate text, and call DataStage jobs.
  • Executed software projects for the financial sector (group insurance), telecom, transport, and manufacturing industries.
  • Personality Traits: Highly Motivated, Strong Logical/Analytical Thinking, Quick Learner, Lucid Communication, Excellent Interpersonal Skills, Team Player, Lead by Example, Project First Team First Attitude, Well Planned, Excellent Aptitude to learn new technologies and applications on job.
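
A minimal Snowflake SQL sketch of the feature set described above (zero-copy cloning, Time Travel, and Streams); all object names (analytics_db, orders, orders_stream, orders_audit) are hypothetical:

    -- Zero-copy clone: an instant, storage-free copy of a database for testing.
    CREATE DATABASE analytics_db_dev CLONE analytics_db;

    -- Time Travel: query a table as it existed 30 minutes ago.
    SELECT COUNT(*) FROM analytics_db.public.orders AT(OFFSET => -60 * 30);

    -- Stream: capture row-level changes for incremental processing.
    CREATE OR REPLACE STREAM orders_stream ON TABLE analytics_db.public.orders;

    -- Consuming the stream in DML advances its offset, so each change is
    -- processed once; order_id and status are hypothetical columns.
    INSERT INTO analytics_db.public.orders_audit
    SELECT order_id, status, METADATA$ACTION, METADATA$ISUPDATE
    FROM orders_stream;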
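
A sketch of the complex SQL patterns above, in Oracle dialect; emp is the classic sample table, and emp_deptno_idx is a hypothetical index:

    -- Hierarchical query: walk the reporting chain from the top.
    SELECT LEVEL, LPAD(' ', 2 * (LEVEL - 1)) || ename AS org_chart
    FROM   emp
    START WITH mgr IS NULL
    CONNECT BY PRIOR empno = mgr;

    -- Correlated subquery with an optimizer hint: employees paid above
    -- their department's average salary.
    SELECT /*+ INDEX(e emp_deptno_idx) */ e.ename, e.sal, e.deptno
    FROM   emp e
    WHERE  e.sal > (SELECT AVG(i.sal) FROM emp i WHERE i.deptno = e.deptno);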
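
And a PL/SQL sketch of the rule-engine pattern referenced above, combining dynamic SQL, BULK COLLECT, and centralized exception handling; the rule_defs, rule_violations, and error_log tables are hypothetical:

    DECLARE
      TYPE t_ids IS TABLE OF NUMBER;
      v_failed_ids t_ids;
    BEGIN
      -- Rules are data, not code: each row holds a validation query that
      -- returns the ids of rows violating that rule.
      FOR r IN (SELECT rule_id, validation_sql FROM rule_defs WHERE active = 'Y')
      LOOP
        EXECUTE IMMEDIATE r.validation_sql BULK COLLECT INTO v_failed_ids;

        IF v_failed_ids.COUNT > 0 THEN
          FORALL i IN 1 .. v_failed_ids.COUNT
            INSERT INTO rule_violations (rule_id, row_id)
            VALUES (r.rule_id, v_failed_ids(i));
        END IF;
      END LOOP;
    EXCEPTION
      WHEN OTHERS THEN
        -- Central error reporting (in practice via an autonomous logging procedure).
        INSERT INTO error_log (logged_at, err_msg) VALUES (SYSTIMESTAMP, SQLERRM);
        RAISE;
    END;
    /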

TECHNICAL SKILLS

RDBMS: Oracle 10g/11g/12c/Exadata, DB2 9, Sybase 15, MS SQL Server 2012

ETL Tools: Informatica PowerCenter V9.5, Confidential WebSphere DataStage 8.x/9.x/11.5/11.7, SQL*Loader

Programming Languages: SQL, PL/SQL

Scripting Languages: Unix Scripting (sed, awk, grep), KIX Windows Scripting

Schedulers: Unix Crontab, CA Workload Center Autosys R11

Operating Systems: Unix (Sun Solaris, HP Unix, AIX), Windows NT/98/95/2000 & XP

Development Tools: Oracle SQL Developer, SQL*Plus

Modeling tools: Toad, Erwin

Data Base Tools: OEM, AWR, TKProf

DevOps Tools: Jenkins, Zookeeper, and Artifactory

Cloud DBs: Snowflake Cloud DB, Amazon Redshift

Public Cloud: Amazon Web Services including EC2, VPC, S3, EBS, ELB, IAM, RDS, Route 53, CloudFront, CloudWatch

Version control Tools: Confidential ClearCase, SVN & GitHub

Other Tools: PVCS Version Manager, HP ALM (Application Lifecycle Manager)

Process: Waterfall Model, Agile Development

PROFESSIONAL EXPERIENCE

Confidential, Quincy, MA

Sr. Snowflake Developer

Responsibilities:

  • Reviewed business requirements with the client, worked with the Business Analysts to iron out gaps in the requirements, and advised on application improvements and caveats.
  • Wrote SnowSQL and stored procedures, and designed Snowflake tables (temporary, transient, permanent).
  • Worked with Snowflake objects - warehouses, roles, databases, schemas, tables, views, constraints, and table clustering keys.
  • Worked with Snowpipe to enable automatic loading of data from files, including semi-structured types such as JSON, Avro, and Parquet (see the pipe sketch after this list).
  • Worked with Snowflake Streams, Time Travel, the COPY statement, and VALIDATE, and validated pipe load results.
  • Requirement analysis: analyzed the requirements to validate assumptions about the application architecture against the actual architecture, based on code and data analysis, and updated the business on any discrepancies; identified reusable existing modules that might satisfy the requirements and suggested alternative approaches to reduce effort.
  • Design - technical design creation: created comprehensive TDDs (Technical Design Documents) with clearly documented assumptions and technical flow; performed physical data modeling - table structures, indexes, sequences, constraints, triggers, and other database objects; produced a modularized design with package/procedure parameters; designed an application-level exception handling strategy that was adopted as a standard; and clearly defined the Unix scripting, ETL functionality, and Autosys scheduling.
  • Development: developed and led the end-to-end solution integrating the project's technologies - ETL DataStage jobs, sequencers, mappings, transformations, UDFs, reusable components, command tasks, Oracle PL/SQL programming, and stored procedures in Microsoft SQL Server.
  • Migrated DataStage 8.5 Jobs to DataStage 11.5 as part of ITA App project.
  • Migrated DataStage 11.5 jobs to DataStage 11.7 as part of the ETL Migration project.
  • Migrated on-premises databases (MySQL, SQL Server, Oracle, Netezza) to Snowflake staging schemas.
  • Developed DataStage Jobs, Unix Scripts (including sed, awk), PL/SQL, and SQL Performance Tuning.
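
A sketch of the Snowpipe setup described above; the stage, pipe, and table names are hypothetical, and the stage is assumed to point at an S3 location delivering JSON files:

    -- External stage over the landing bucket (credentials omitted).
    CREATE OR REPLACE STAGE raw_stage
      URL = 's3://example-bucket/events/'
      FILE_FORMAT = (TYPE = JSON);

    -- Semi-structured payloads land in a VARIANT column.
    CREATE OR REPLACE TABLE raw_events (payload VARIANT);

    -- AUTO_INGEST = TRUE lets S3 event notifications trigger the load.
    CREATE OR REPLACE PIPE events_pipe AUTO_INGEST = TRUE AS
      COPY INTO raw_events FROM @raw_stage;

    -- Validate recent pipe loads, as in the VALIDATE work above.
    SELECT * FROM TABLE(VALIDATE_PIPE_LOAD(
      PIPE_NAME  => 'events_pipe',
      START_TIME => DATEADD(HOUR, -1, CURRENT_TIMESTAMP())));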

Environment: ETL - Confidential InfoSphere DataStage V8.5/11.5/11.7, Oracle 12c (SQL, PL/SQL), Unix Shell Scripting, Snowflake Cloud DB 5.8.2, SnowSQL, MS SQL Server 2012, IICS Cloud ETL, AWS Cloud (SQS, EC2, S3, Redshift, RDS), Autosys, Microsoft Azure

Confidential, Roseland, NJ

Sr. DataStage Developer/Lead

Responsibilities:

  • Reviewed business requirements with the client, worked with the Business Analysts to iron out gaps in the requirements, and advised on application improvements and caveats.
  • Walked through the existing code to understand the components and the application.
  • Streamlined the code to eliminate redundancy and standardized the coding practices.
  • Analyzed the incoming/expected file data with the Systems Analyst using data profiling; created ER diagrams and possible data scenarios.
  • Worked with the Sequencer stage of Confidential DataStage - Exception Handler, Looping, Command Execution, and User Variables - and with processing stages including Transformer, Lookup, Merge, Funnel, Filter, Join, Modify, Remove Duplicates, Sort, Change Capture, Copy, Pivot, Surrogate Key, Data Set, Aggregator, and Slowly Changing Dimension (see the SCD sketch after this list).
  • Integrated various technologies in the application using ETL DataStage jobs, sequencers, mappings, transformations, UDFs, reusable components, command tasks, Oracle PL/SQL programming, and Oracle Forms & Reports.
  • Developed DataStage Jobs, Unix Scripts (including sed, awk, and Perl), PL/SQL, and SQL Performance Tuning.
  • Worked as Scrum Master for the Compass upgrade project release, and was involved in implementing the Agile framework across the project life cycle.
  • Created Code Review Standards for Unix Scripting, SQL, and PL/SQL.
  • Performed code reviews to identify missed requirements; checked coding standards, exception handling, scenario coverage, and scheduling; and caught performance issues and impacts on other applications.
  • Improved performance, reducing run times to between 50% and as little as 5% of their original duration.
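
The Slowly Changing Dimension work above, shown as a plain SQL sketch (DataStage handles this in its SCD stage); a Type 1 overwrite merge in Oracle dialect, with hypothetical dim_customer and stg_customer tables:

    -- Update changed attributes in place, insert new keys.
    MERGE INTO dim_customer d
    USING stg_customer s
    ON (d.customer_id = s.customer_id)
    WHEN MATCHED THEN
      UPDATE SET d.name = s.name, d.segment = s.segment
      WHERE d.name <> s.name OR d.segment <> s.segment
    WHEN NOT MATCHED THEN
      INSERT (customer_id, name, segment)
      VALUES (s.customer_id, s.name, s.segment);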

Environment: Sun Solaris, Oracle 10g & 12c, SQL, PL/SQL, Unix Scripting, ETL, Confidential DataStage V8.5/9.x/11.5, Oracle Reports 10g, Oracle Forms 10g, COBOL, KIX Scripting, Web Services, SQL Server, T-SQL, Teradata, Erwin, Jira

Confidential, Roseland, NJ

Sr. ETL Database Developer/ Project Lead

Responsibilities:

  • Code analysis: reverse-engineered the application code - mainly DB2 SQL, PL/SQL, Unix scripts, and cron jobs - to understand the application and document the functional flow.
  • Data analysis: analyzed the production-level dimensional data; created ER diagrams and possible data scenarios; made sense of undocumented application functionality from the data; and identified application issues based on data discrepancies, using aggregate functions with GROUP BY, CUBE, correlated subqueries, PARTITION BY, and set functions.
  • Requirements analysis: analyzed the requirements to validate assumptions about the application architecture against the actual architecture, based on code and data analysis, and updated the business on any discrepancies; identified reusable existing modules that might satisfy the requirements, suggested alternative approaches to reduce effort, and suggested additional features that might benefit the stakeholders.
  • Technical design creation: created comprehensive TDDs (Technical Design Documents) with clearly documented assumptions and technical flow; performed logical and physical data modeling - dimension tables, fact tables, star/snowflake design, warehouse indexing, constraints, triggers, and other database objects; produced a modularized design with package/procedure parameters; designed an application-level exception handling strategy that was adopted as a standard; and clearly defined the Unix scripting, ETL functionality, and cron job setup.
  • Rule engine designs: designed applications around a generic, reusable design driven by configured rules - a Generic Load Framework that loads flat-file data, after cleansing and formatting, into the dimension tables; a Generic Validation and Staging Framework that validates input data against business rules and compares each record to its predecessor in the database; and a Generic Aggregation Framework that aggregates facts based on the configured keys and stores them in the fact tables.
  • Developed the ETL frameworks: the Generic Loader Framework, built mainly in Unix and awk scripting, reformats data into a standard format and loads it into database staging tables using the DB2 load utility; the Generic Validation & Staging Framework triggers complex PL/SQL procedures that validate the data against the requirements and against the data already in the database (partial refresh), identify and load change records into the active dimension tables, and move the old data to history tables.
  • The Generic Aggregation Framework aggregates the data at the configured level and loads the aggregated data into the fact tables (see the aggregation sketch after this list).
  • Created the Reporting Framework, which produces the control and error reports from the process, staging, and dimension tables.
  • Developed complex Unix scripts (sed, awk, Perl), PL/SQL, SQL performance tuning, and SQL reports.
  • Developed scripts to generate test data for unit testing.
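
A sketch of the aggregation step in plain SQL; the framework read its grouping keys from configuration, but they are written out literally here, and stg_sales and fact_sales_daily are hypothetical names:

    -- Aggregate staged measures at the configured grain into the fact table.
    INSERT INTO fact_sales_daily (sale_date, product_key, region_key, units, revenue)
    SELECT sale_date, product_key, region_key, SUM(units), SUM(revenue)
    FROM   stg_sales
    GROUP BY sale_date, product_key, region_key;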

Environment: Sun Solaris, Windows, DB2, SQL, PL/SQL, Unix Scripting, ETL, Crystal Reports

Confidential, Charlotte, NC

Sr. PL/SQL & DataStage Developer, Module Lead

Responsibilities:

  • Reverse-engineered the DataStage jobs, PL/SQL code, and Unix scripts to understand the application and document the functional flow.
  • Designed ETL jobs in DataStage; created ETL schedules and automated the data loading.
  • Created comprehensive FSDs (Functional Specification Documents) that clearly documented scope and assumptions, with logical data design and ER diagrams, functional flow for existing and new functionality, suggestions for technical implementation, performance bottleneck and volume testing, and other information that saved technical design and development effort.
  • Created comprehensive TDDs (Technical Design Documents), clearly documented assumptions, with technical flow.
  • Worked on physical data modeling - designed table structures, indexes, sequences, constraints, triggers, and other database objects; produced a modularized design with package/procedure parameters; designed an application-level exception handling strategy that was followed as a standard; and clearly defined the Unix scripting and cron jobs.
  • Designed applications around a generic, reusable design driven by configured rules - several complex rule engines for loading input data from external sources, validation programs, self-healing programs based on the rules setup, and more.
  • Developed DataStage jobs, Unix scripts (including sed, awk, Perl), highly complex PL/SQL, and SQL performance tuning.
  • Developed scripts to generate test data for unit testing (a sketch follows).
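
A sketch of test data generation in Oracle SQL using a row generator; test_orders and its columns are hypothetical:

    -- Generate 10,000 test rows for unit testing.
    INSERT INTO test_orders (order_id, customer_id, order_date, amount)
    SELECT LEVEL,                                        -- unique id
           MOD(LEVEL, 100) + 1,                          -- 100 distinct customers
           DATE '2020-01-01' + MOD(LEVEL, 365),          -- spread across a year
           ROUND(DBMS_RANDOM.VALUE(10, 500), 2)          -- random amount
    FROM   dual
    CONNECT BY LEVEL <= 10000;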

Environment: AIX, Korn shell scripts, Oracle 10g, Confidential WebSphere DataStage 8.1, Jira

Confidential, Stamford, CT

ETL Developer/ Lead

Responsibilities:

  • Designed the project architecture, designed and developed ETL jobs, performed POCs, delegated work, and was responsible for timely deliverables.

Environment: Oracle 10g, SQL Server, DataStage 8.1, Information Analyzer, QualityStage 8.1

Confidential, Stamford, CT

ETL Lead

Responsibilities:

  • Requirement analysis, client interaction, cost and effort estimation, data modeling, project architecture design, and DataStage job design.

Environment: Oracle 10g Database, DataStage 8.1, Oracle (SQL & PL/SQL)

Confidential, Stamford, CT

ETL Lead

Responsibilities:

  • Involved in designing, creating, and implementing the Battery Life data warehouse using DataStage 7.5.1A Parallel Extender (Enterprise Edition) as the ETL tool.
  • Involved in conceptual, logical, and physical data modeling; data extraction from Tier3; data transformation; data cleansing; and loading the data into the Oracle database.
  • Interacted with the client and subject matter experts to understand the requirements, prepared the technical design document, and implemented the ETL module.
  • Involved in reviewing the data coming from different sources to surface data discrepancies and suggestions.
  • Moved the Battery Life DataStage jobs through the phases of the software cycle (Dev to QA to Production).

Environment: Oracle 10g Database, DataStage 8.1, Oracle (SQL & PL/SQL)

Confidential, Stamford, CT

ETL Lead

Responsibilities:

  • Participated in gap analysis, coordinated with SMEs to capture high-level requirements, and performed technical analysis of the requirements to identify gaps and challenges for integration.

Environment: Oracle 10g Database, DataStage 8.1, Oracle (SQL & PL/SQL)

Confidential, Stamford, CT

ETL Developer & PL/SQL Senior Developer

Responsibilities:

  • Involved in designing, creating, and implementing the Mailstream data warehouse using DataStage Parallel Extender (Enterprise Edition) as the ETL tool.
  • Involved in conceptual, logical, and physical data modeling; data extraction from Tier3, IMS, and Broadvision (PDFM file); data transformation; data cleansing; and loading the data into the Oracle database.
  • Interacted with the client and subject matter experts to understand the requirements, prepared the technical design document, and implemented the ETL module.
  • Involved in reviewing the data coming from different sources to surface data discrepancies and suggestions.
  • Moved the Mailstream DataStage jobs through the phases of the software cycle (Dev to QA to Production).

Environment: Confidential WebSphere DataStage 7.5.1, Oracle Database 10g

Confidential

ETL & PL/SQL Coding

Responsibilities:

  • Performed data modeling with Erwin, requirement analysis, client interaction, writing database stored procedures and functions, troubleshooting, test planning, and project development.
  • Involved in the deployment, maintenance, and support of the application at 23 client locations.

Environment: Oracle 10g Database, DataStage 7.5.1 & Oracle (SQL & PL/SQL)

Confidential

PL/SQL Developer

Responsibilities:

  • Performed data modeling with Erwin, requirement analysis, client interaction, writing database stored procedures and functions, troubleshooting, test planning, project development, and coding.
  • Involved in the deployment, support, and maintenance of the application at 2 client locations.

Environment: Oracle 10g, PL/SQL

Confidential

PL/SQL Developer

Responsibilities:

  • Involved in the application development of modules such as MCD and ACSP.
  • Data migration was a major part of this application development.

Environment: Oracle 10g, PL/SQL
