Technical Lead - Data Warehouse Resume
SUMMARY:
- ETL Developer and Project Lead with more than 13 years of experience in IT, with strong knowledge of Design, Development, and Testing
- Experienced in open-source development using C, Shell Scripts, Python, Core Java, and SQL in Linux/UNIX environments.
- Extensive experience in domains such as Banking, Retail, Financial Services, and Manufacturing
- Good working knowledge of Development Processes & Methodologies (Waterfall, Scrum, Kanban) and Quality Assurance.
- Proficient in analyzing and translating business requirements to technical requirements and architecture
- Extensive knowledge of Data Warehousing, Project Management, International Code Compliance and Interpretation, Quality Assurance and Control, and Project Analysis.
- Worked extensively on databases such as Oracle, IBM Netezza, and PostgreSQL.
- Good knowledge of Banking Applications (worked most extensively on Loans), Financial Services, and the Scaled Agile Framework (SAFe).
- Experience with source code version control systems such as Git and CVS (Concurrent Versions System).
- Worked on the Hadoop platform, with good knowledge of Big Data technologies such as Pig, Hive, and Sqoop.
- Used Pig to process transaction details and generate reports.
- Completed ITIL certification
- Experience coordinating with geographically distributed teams and mentoring team members to resolve dependencies and meet project timelines
- Good knowledge of SQL queries and of creating database objects such as Stored Procedures, Triggers, and Packages.
- Requirement Analysis - Participated in discussion forums such as PRD Discussions, Stakeholder Discussions, SWAG/Estimation Discussions, Post-Mortem Analysis Discussions, and SOW Discussions
- Development - Involved in all workflow stages: Analysis & Design, Coding, Google Test, Code Review, and Bug Tracking.
- Meetings - Team meetings, daily Scrum stand-up meetings, and onsite coordination calls
- Documentation - Responsible for managing document deliverables such as the Technical Design Document and High-Level & Low-Level Design Documents
- Extensively worked on various Data Warehouse projects using UNIX Shell Scripts and C; working knowledge of Informatica Client Tools, Splunk, Python, Perl, and SyncSort. Good experience writing shell scripts alongside ETL code for process execution, quality assurance, and job-status notification.
- Currently managing Core Development and the Data Warehouse Production Support/Help Desk, which involves effort estimation, planning of coverage and support, and managing resources both onsite and offshore to ensure 24/7 support of the environment within agreed SLA timelines
- Adept at working with diverse technical groups and end users to develop technical solutions that meet or exceed expectations, with good analytical, communication, and problem-solving skills.
- Key resource on various offshore delivery projects; quick to understand new technical environments and adapt, with fast turnaround and high productivity.
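As a hedged illustration of the shell-scripted ETL support mentioned above (process execution with logging and job-status notification), the sketch below uses placeholder step names, a demo log path, and an echo-based notify; a real setup would likely send mail or raise an alert instead.

```shell
#!/bin/sh
# Minimal ETL wrapper sketch: run each step, log its outcome,
# and notify on failure. All names/paths here are illustrative.

LOG=/tmp/etl_demo.log
: > "$LOG"   # start a fresh demo log

notify() {
    # Placeholder for a real alert (e.g. mailx to an on-call list)
    echo "NOTIFY: $1"
}

run_step() {
    step_name=$1; shift
    echo "$(date '+%Y-%m-%d %H:%M:%S') START $step_name" >> "$LOG"
    if "$@" >> "$LOG" 2>&1; then
        echo "$(date '+%Y-%m-%d %H:%M:%S') OK $step_name" >> "$LOG"
    else
        rc=$?
        echo "$(date '+%Y-%m-%d %H:%M:%S') FAIL $step_name rc=$rc" >> "$LOG"
        notify "$step_name failed with rc=$rc"
        return "$rc"
    fi
}

# Demo commands standing in for real extract/load jobs
run_step extract true
run_step load    true
echo "pipeline complete"
```

The wrapper pattern keeps status handling in one place, so every batch job gets consistent logging and failure alerting without duplicating boilerplate per job.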
TECHNICAL SKILLS:
Database Systems: Oracle Exadata, IBM Netezza, Oracle 11g, SQL Server 2012/2008R2
Programming Languages: C, Core Java, Pro*C, C++, HTML
Scripting Languages: Shell Scripting, Perl, Python, JavaScript
Operating Systems: UNIX, Linux, Solaris, AIX, Windows-XP/2000/NT
Tools (ETL/BI): Informatica PowerCenter, Eclipse, SSIS 2012/2008, SSAS 2012/2008
IDE: Eclipse, vi editor, Notepad++
Tools: Confluence, Eclipse, QTCreator, CFT, Syncsort, ALM, Git, Rally, HUE, Tivoli, Control-M
Big data Technologies: Hadoop, Hive, Sqoop, Impala, Spark, HDFS, MapReduce
PROFESSIONAL EXPERIENCE
Technical Lead - Data Warehouse
Confidential
Responsibilities:
- Analyzed the specifications provided by clients and implemented them using ETL technologies such as SyncSort, Informatica IDQ, and UNIX in Agile and non-Agile methodologies
- Worked with data investigation, discovery, and mapping tools to scan every data record from many sources and load it into Oracle, Netezza, and Hadoop platforms using C, shell scripts, SyncSort, etc.
- Wrote ETL (Extract/Transform/Load) processes, designed database systems, and developed tools for real-time and offline analytic processing
- Implemented and monitored jobs on the Hadoop ecosystem using Pig, Flume, Sqoop, Hive, HDFS, Python, and Impala commands
- Maintained and integrated data from multiple data sources in the Hadoop cluster using Big Data querying tools such as Pig and Hive
- Wrote and executed unit, system, integration, and UAT scripts in data warehouse projects.
- Troubleshot test scripts, SQL queries, and data warehouse/data mart/data store models.
- Designed and developed scripts to improve performance of on-premises ETL tasks involving relational database sources such as DB2 on mainframe and heterogeneous sources like CSV, XML, and flat files, transforming them into consumable data through DMX, lookups, etc.
- Worked on database query tuning to increase performance (query optimization).
- Analyzing and fixing critical and performance related issues during the testing phase of the product.
- Reviewed the design documents and code for various enhancements.
- Provided application support for existing batches and was involved in implementations and TRTs
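One routine quality-assurance task in this kind of batch support is reconciling source row counts against what the load actually accepted. The sketch below is a minimal, hypothetical version of such a check; the file names and demo data are placeholders, not artifacts from any actual project.

```shell
#!/bin/sh
# Illustrative load-reconciliation check: compare the row count of a
# source extract against the rows present after the load.
# File names and data below are hypothetical demo stand-ins.

SRC=/tmp/txn_extract.csv
LOADED=/tmp/txn_loaded.csv

# Demo data standing in for a real extract and a real load result
printf 'id,amount\n1,10\n2,20\n3,30\n' > "$SRC"
printf 'id,amount\n1,10\n2,20\n3,30\n' > "$LOADED"

src_rows=$(( $(wc -l < "$SRC") - 1 ))      # subtract header row
tgt_rows=$(( $(wc -l < "$LOADED") - 1 ))

if [ "$src_rows" -eq "$tgt_rows" ]; then
    echo "RECONCILED: $src_rows rows"
else
    echo "MISMATCH: source=$src_rows target=$tgt_rows" >&2
fi
```

A mismatch here is typically the first signal that a batch dropped or rejected records, so checks like this often run as the final step of a load job before the SLA window closes.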
Module Lead
Confidential
Responsibilities:
- Analysis of the specifications provided by the clients
- Design and Development of a few major modules like Trend line.
- Coding using C, Shell Script, and Perl, deployed in the Linux environment.
- Enhancement of reports like Stock Guide, Trend Line, Earnings Guide etc.
- Testing - unit testing & integration testing
- Responsible for overseeing the Quality procedures related to the project.
- Responsible for the change management and source code management for the project.
- Analyzed bug reports when the product went live and worked towards bringing the bugs to closure.
- In charge of cross-training between teams.
- Initiated new process for the knowledge enhancement of the team.
- Prepared weekly and monthly status reports and DCF reports; prepared Lessons Learned documents and Self-Support documents.
Systems Analyst
Confidential
Responsibilities:
- Analysis of the specifications provided by the clients
- Design and Development of a few major modules like Repayment Setoff Logic (both vertical and horizontal), Delinquency Calculations and Loan Booking.
- Coding using Java, XML, and JavaScript, deployed in the Linux environment.
- Development of reports such as the Trial Balance report and the User Profile Maintenance report.
- Testing - unit testing & integration testing
- Responsible for overseeing the Quality procedures related to the project.
- Responsible for the change management and source code management for the project
Developer
Confidential
Responsibilities:
- Analysis of the specifications provided by the clients
- Writing Detailed Design (LLD) documents for the basic menu options.
- Coding using Java, C, C++, XML, and JavaScript in the UNIX environment.
- Testing - unit testing & integration testing
- Analyzed bugs and worked towards bringing them to closure.