Sr. Snowflake Developer Resume
Chicago, IL
SUMMARY
- Snowflake Developer with 8 years of experience in developing Data Warehousing and Data Analytics applications for Banking, Financial and Insurance sectors.
- 4+ years of software development experience using the Snowflake data warehouse
- 5+ years of experience in SQL, data warehousing, and Python
- Snowflake Core Certified Professional with 3 years of strong experience in cloud data warehouse implementation on AWS.
- Strong in SQL programming with stored procedures, temporary tables, indexes, functions, triggers, views, database performance tuning, and writing simple/complex queries.
- Hands-on experience building EDWs, data marts, and operational data stores (ODS), with good programming knowledge and design skills using Snowflake utilities and ETL/ELT applications.
- Complete end-to-end SDLC knowledge (requirements analysis, design, development, deployment, and support). Involved in end-to-end development, maintenance, enhancement, and performance improvement projects.
- Hands-on experience in bulk loading and unloading data into Snowflake tables using the COPY command
- Strong working exposure to ETL testing and report testing
- Good working experience using Tasks in Snowflake to schedule jobs with CRON and non-CRON schedule variants
- Good knowledge of data analytics using Python
- Troubleshooting performance issues with Query Profile; able to handle large data volumes
- Used COPY/INSERT and PUT/GET commands for loading data into Snowflake tables from internal/external stages
- Experience using Snowflake cloning and Time Travel
- Good working experience developing stored procedures and user-defined functions using the Python API/SQL Scripting
- Good working experience using SQL Profiler to inspect the queries/tables involved while debugging application issues
- Hands-on working experience using Snowflake and Python ETL to load data from the analytics environment to on-prem databases
- Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, and the JavaScript API for creating Snowflake stored procedures
- Experience using Snowflake's Secure Data Sharing feature to share data with other Snowflake accounts and third-party consumers
- Strong analytical, creative problem-solving, decision-making, communication, and interpersonal skills.
- Result-oriented work ethic and able to adapt to new technologies.
- Hands-on experience creating Snowflake secure views, materialized views, and regular views
- Good experience with Snowflake table types (permanent, temporary, transient, external) and their appropriate usage
- Experience in defining virtual warehouse sizing in Snowflake for different types of workloads
- Provided production support for applications deployed to production
- Experience in SQL, Unix shell scripting (Korn), and PL/SQL programming.
- Proficient in using databases such as Oracle 11g/10g/9i/8i, Postgres, and MySQL, including PL/SQL and stored procedures.
- Good knowledge on Data Warehousing concepts like Dimensional Data Modelling, Star schema, Snowflake schema, creation of Fact and Dimension Tables, OLAP, OLTP, etc.
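The stage-based bulk load pattern referenced above can be sketched as follows. This is an illustrative example only; the stage, table, and file names are hypothetical:

```sql
-- Upload a local file to a named internal stage (compressed automatically).
PUT file:///tmp/customers.csv @my_stage AUTO_COMPRESS = TRUE;

-- Bulk load the staged file into a target table.
COPY INTO customers
  FROM @my_stage/customers.csv.gz
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
  ON_ERROR = 'CONTINUE';
```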
TECHNICAL SKILLS
Cloud Technologies: Snowflake, AWS, Athena, Matillion
Programming Languages: Python 3.6 (with Pandas), JavaScript, SnowSQL
Databases: Snowflake Warehouse, SQL Server, MySQL, PostgreSQL
ETL Tools: Snowflake, Matillion
Domain/Functional Area: Banking, Financial, Insurance
PROFESSIONAL EXPERIENCE
Confidential, Chicago IL
Sr. Snowflake Developer
Responsibilities:
- Used Snowflake's Secure Data Sharing feature to share data with other Snowflake accounts and third-party consumers
- Created a data integration framework in Snowflake for both batch and real-time data from different file formats (CSV, JSON, XML) using Snowflake stages and Snowpipe
- Created internal and external stages and transformed data during the load
- Developed regular/materialized/secure views in Snowflake per business requirements to improve performance and enforce row-level security
- Implemented column-level security using MASKING POLICY for sensitive/PHI data
- Developed stored procedures and user-defined functions using the JavaScript API/SQL Scripting
- Used Cloning and Time Travel features in Snowflake to ensure maintenance and availability of historical data
- Defined the roles and privileges required to access different database objects
- Participated in development, improvement, and maintenance of Snowflake database applications
- Used Tasks in Snowflake to schedule end-to-end ETL workflows together with Streams and Snowpipe
- Created Streams in Snowflake to capture DML changes on source tables/external tables
- Created external tables in Snowflake to process data in external stages in the cloud
- Used temporary and transient tables for staging across different databases.
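The masking-policy and Streams/Tasks work above follows a standard Snowflake pattern, sketched here with hypothetical table, policy, and warehouse names:

```sql
-- Column-level security: mask PHI for all but a privileged role.
CREATE MASKING POLICY ssn_mask AS (val STRING) RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() IN ('PHI_ADMIN') THEN val
       ELSE 'XXX-XX-XXXX' END;

ALTER TABLE patients MODIFY COLUMN ssn SET MASKING POLICY ssn_mask;

-- Capture DML changes on the source table.
CREATE STREAM patients_stream ON TABLE patients;

-- Process captured changes hourly, only when the stream has data.
CREATE TASK load_patient_changes
  WAREHOUSE = etl_wh
  SCHEDULE = 'USING CRON 0 * * * * UTC'
WHEN SYSTEM$STREAM_HAS_DATA('patients_stream')
AS
  INSERT INTO patients_ods SELECT * FROM patients_stream;

-- Tasks are created suspended and must be resumed to start running.
ALTER TASK load_patient_changes RESUME;
```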
Environment: Snowflake, AWS, Athena, Matillion, JIRA, DevOps.
Confidential, Chicago IL
Snowflake Developer
Responsibilities:
- Created Snowpipe for continuous data load integration from an S3 bucket
- Built Snowflake/Python ETL to load data from the analytics environment to on-prem databases
- Validated data from SQL Server to Snowflake to ensure an apples-to-apples match
- Used the FLATTEN table function to produce a lateral view of VARIANT, OBJECT, and ARRAY columns
- Cloned production data for code modifications and testing
- Developed SQL queries using SnowSQL
- Used COPY, LIST, GET, and PUT commands to validate internal stage files
- Used the VALIDATE() table function to validate past executions of data loading via COPY statements
- Used COPY_HISTORY() and LOAD_HISTORY() to review the history of data loaded via COPY for both bulk and continuous data loading.
- Created internal and external stages and transformed data during loading.
- Performed data quality issue analysis using SnowSQL by building analytical warehouses in Snowflake.
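The Snowpipe and FLATTEN work above can be sketched as follows; the pipe, stage, table, and column names are hypothetical, and the S3 stage is assumed to already exist with event notifications configured:

```sql
-- Continuous loading: auto-ingest JSON files as they land in the S3 stage.
CREATE PIPE orders_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw_orders
  FROM @s3_orders_stage
  FILE_FORMAT = (TYPE = 'JSON');

-- Lateral view over a VARIANT column holding an array of line items.
SELECT o.raw:order_id::STRING AS order_id,
       li.value:sku::STRING   AS sku,
       li.value:qty::NUMBER   AS qty
FROM raw_orders o,
     LATERAL FLATTEN(input => o.raw:line_items) li;
```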
Environment: Snowflake, Python, DevOps
Confidential, Chicago IL
ETL/SQL Developer
Responsibilities:
- Participated in the KT sessions of the BIP project and understood the aim, requirements, and architecture of the project. Met with business analysts to understand the requirements for on-boarding new data sources
- Analysed and reviewed the business specification documents to identify sources, understood the transformations required to map source attributes to final targets, and created source-to-target mapping documents
- Efficiently used different components in the ETL component library to implement various requirements
- Created or updated ETLs to read source data and map it to targets, solving various business requirements.
- Reviewed the code with the development team and updated for any review comments
- Worked on performance enhancement of graphs using skew optimization techniques using partitioning components, different parallelism concepts like data, pipeline and component parallelism, lookups, efficiently utilizing memory, in-memory vs sorted rollups/joins(when required) etc.
- Checked in the code to repository and migrated to higher environments.
- Proactively supported the Application Support Group for ETL jobs by developing hot fixes for failed jobs, monitoring processing time for critical graphs, and validating data at the source and target ends.
- Participated in daily stand-up meetings to provide daily updates on the work
- Involved in performance testing of ETL by running the schedules with prod volumes and monitoring the system performance.
Environment: SQL, Stored Procedures, SSIS, SQL Server.
Confidential, Durham NC
SQL/ETL Developer
Responsibilities:
- Participated in the KT sessions of the RRDDW project and understood the aim, requirements and architecture of the project.
- Understood the review cycle of the project. Analysed and reviewed the release requirements and Solution Design with the Designers so as to have proper understanding of the requirements.
- Performed the necessary changes as per the release requirements, performed unit testing and deployed the code to QA and UAT environments.
- Prepared ETL standards, naming conventions and wrote ETL workflow documentation for stage, ODS and Mart
- Prepared ETL mapping specifications for stage loads and target loads
- Designed SSIS packages to load the data from source systems into target systems
- Created packages using SSIS for data extraction from Flat Files, Excel Files and OLEDB to SQL Server
- Prepared Business staging views, procedures, functions and loaded data into DW and DM tables using BI-Ready ETL tool
- Prepared validation scripts to validate the data between the stages
- Validated Tableau report data against SQL Server
- Performed performance tuning of SQL scripts/stored procedures
- Developed BCP scripts to load data from the results of staging views into target tables
- Extensively used SSIS transformations and tasks such as Lookup, OLE DB Command, Derived Column, Conditional Split, Multicast, Send Mail Task, Execute SQL Task, and Execute Package Task
- Scheduled SSIS packages using SQL Server Agent
- Involved in the deployment of SSIS packages
- Implemented package configurations, error handling, used different container tasks to process the data
- Developed import and export packages using import export wizards
- Used SQL Server Management Studio to create complex SQL stored procedures, views, and user-defined functions to evaluate results for reports
- Developed, deployed and monitored SSIS packages
- Configured packages using the Package Configuration Wizard to allow packages to run in different environments.
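The stage-to-target validation scripts mentioned above typically reduce to reconciliation queries like the following T-SQL sketch (schema and table names hypothetical):

```sql
-- Row-count reconciliation between the staging and warehouse layers.
SELECT 'stage' AS layer, COUNT(*) AS row_cnt FROM stg.Customer
UNION ALL
SELECT 'target', COUNT(*) FROM dw.DimCustomer;

-- Keys present in stage but missing from the target (failed loads).
SELECT s.CustomerID
FROM stg.Customer s
EXCEPT
SELECT d.CustomerID
FROM dw.DimCustomer d;
```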
Environment: SQL Server 2008 R2, SSIS, Tableau, Alteryx.
Confidential
SQL Developer
Responsibilities:
- Analysed and reviewed the Business Requirements Specification and Solution Design with the Designers so as to have proper understanding of the requirements.
- Created Sources definitions from Flat Files and imported/created Targets definitions from the respective databases and created reusable Transformations.
- Transformed and conditioned the data into standard formats using SQL
- Worked on complex SQL stored procedures to derive new fields and solve various business requirements.
- Worked on performance enhancement of SQLs
- Proactively supported the Application Support Group for SQL jobs by developing hot fixes for failed jobs, monitoring processing time for critical jobs, and validating data at the source and target ends.
- Participated in regular project status meetings.
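Standardizing source data into consistent formats, as described above, can be sketched in Oracle SQL along these lines (column and table names hypothetical):

```sql
-- Condition raw staged values into standard formats.
SELECT UPPER(TRIM(cust_name))              AS cust_name,     -- normalize case/whitespace
       TO_DATE(raw_dob, 'MM/DD/YYYY')      AS dob,           -- parse text dates
       REGEXP_REPLACE(phone, '[^0-9]', '') AS phone_digits   -- strip punctuation
FROM stg_customers;
```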
Environment: Oracle, SQL, UNIX shell scripting