Technology Architect Resume
SUMMARY
- Versatile, innovative, and skilled IT professional with more than 19 years of experience in both support and development engagements, focused on Data Warehouse/Business Intelligence across traditional and Big Data technology landscapes.
- 10+ years of rich experience in data warehousing projects.
- Hands-on experience in EDW implementation projects, with an excellent track record on global delivery model projects and expertise in T&M and FP engagements with critical timelines and volatile requirements.
- Significant experience in the ETL development process across the DW lifecycle, with specialization in Teradata, Netezza, Big Data, Informatica, and Mainframe tools.
- Around 6 years of international experience at customer locations spanning various states in Confidential.
- Experience with migration projects and hands-on experience with Big Data technologies.
TECHNICAL SKILLS
Key Skills: Big Data technologies; development of data lakes on big data platforms; BI/DW development for EDW and data mart implementations; data warehouse estimation; ETL design and implementation for BI; development and enhancement of legacy systems.
Technology: Teradata 14.0, ETL (Informatica 9.0.2 - Admin and Developer), DWH, SQL, Mainframe, Shell Scripting, HDFS, HIVE, Tableau, AWS Cloud
Functional Areas: Financial Services, Banking, Retail, Telecom, Media and Communication, Securities, Manufacturing, HR and Payroll, Health Care.
Quality tools: Remedy, SharePoint, Clarify, Kintana, Rally
Databases: Teradata 14.0, Netezza 7.0, MS SQL Server 2008, DB2, IDMS/R, VSAM, MS Access.
Improvement/Optimization Techniques: Six Sigma (Green Belt), Lean.
PROFESSIONAL EXPERIENCE
Technology Architect
Confidential
Responsibilities:
- Prepare design documentation, program code, and technical documents for code conversion from Informatica to Redshift queries.
- Work with the infrastructure team to resolve issues associated with AWS cloud services and open relevant tickets.
- Work with the Product Owner to resolve project issues.
- Hands-on development of Redshift queries; a brief sketch follows this list.
- Develop and maintain installation and configuration procedures.
- Attend daily standup calls and report status.
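The sketch below is illustrative only: it shows the general shape of an Informatica filter/expression mapping re-expressed as a single Redshift INSERT ... SELECT. All schema, table, and column names are hypothetical.

```sql
-- Hypothetical Redshift conversion target: one INSERT ... SELECT
-- replacing an Informatica filter + expression transformation.
INSERT INTO dw.customer_dim (customer_id, full_name, segment, load_ts)
SELECT
    s.customer_id,
    TRIM(s.first_name) || ' ' || TRIM(s.last_name) AS full_name,
    CASE WHEN s.annual_spend >= 100000 THEN 'PREMIUM'
         ELSE 'STANDARD'
    END AS segment,                                -- expression logic
    GETDATE() AS load_ts                           -- audit column
FROM staging.customer_raw s
WHERE s.customer_id IS NOT NULL;                   -- filter condition
```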
Confidential
Teradata Architect and Big Data designer
Responsibilities:
- Work on multiple enhancement projects for EDM.
- Produce technical and business solution architectures (logical and physical).
- Lead a team of developers and troubleshoot issues during the development and implementation phases.
- Achieved 90% load gains and a 50%-70% reduction in query execution time, and improved overall system resource usage, by developing and executing data warehouse risk management, load-time reductions, and front-end reporting improvements (an illustrative tuning sketch follows this list).
- Perform data analysis, understand business requirements, and translate them into logical pipelines and processes.
- Evaluate and prototype new technologies in the area of data processing.
- Design and implement data archiving strategy.
- Understand and document existing processes in the Teradata data warehouse.
- Work on estimation and sizing for new project requirements.
- Hold discussions with the client architect team and other stakeholders.
- Evaluate tools for use case fit, perform vendor/tool comparisons and present recommendations.
- Interact with the client and report status on a daily basis.
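A minimal sketch of the kind of Teradata tuning behind the quoted gains, assuming a hypothetical fact table: a month-partitioned primary index plus refreshed statistics, so the optimizer can prune partitions and pick better plans.

```sql
-- Hypothetical Teradata DDL: month-level row partitioning lets the
-- optimizer scan only the relevant partitions for date-bounded queries.
CREATE TABLE edw.sales_fact (
    sale_id   INTEGER NOT NULL,
    store_id  INTEGER NOT NULL,
    sale_date DATE    NOT NULL,
    amount    DECIMAL(12,2)
)
PRIMARY INDEX (sale_id)
PARTITION BY RANGE_N (
    sale_date BETWEEN DATE '2015-01-01' AND DATE '2020-12-31'
              EACH INTERVAL '1' MONTH
);

-- Refreshed statistics give the optimizer accurate column demographics.
COLLECT STATISTICS COLUMN (sale_date), COLUMN (store_id)
    ON edw.sales_fact;
```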
Confidential
ETL Designer and Developer
Responsibilities:
- Create end-to-end data flows for data masking and loading into lower environments.
- Create shell scripts to call Netezza load scripts and TDM workflows.
- Develop on Informatica TDM and create masking rules in TDM; a SQL-level sketch of the idea follows this list.
- Work on estimation and sizing for new project requirements.
- Interact with the client and report status on a daily basis.
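TDM masking rules are configured in the tool itself; the sketch below is only a hypothetical Netezza-flavored SQL equivalent of what such a rule does: substituting sensitive columns deterministically while copying production data into a lower environment. All table and column names are invented.

```sql
-- Hypothetical SQL-level view of a masking rule: sensitive columns
-- (SSN, email) are replaced with deterministic surrogates on the way
-- from production into the dev environment.
INSERT INTO dev.customer (customer_id, ssn, email, city)
SELECT
    customer_id,
    LPAD(CAST(customer_id AS VARCHAR(9)), 9, '0')                AS ssn,
    'user' || CAST(customer_id AS VARCHAR(12)) || '@example.com' AS email,
    city                                           -- non-sensitive, kept as-is
FROM prod.customer;
```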
Confidential
Big Data Designer and Developer
Responsibilities:
- Create the logical model for an organized data lake.
- Design, integrate, and document technical components for seamless data extraction and analysis on the big data platform.
- Physically implement the Hive data model; see the HiveQL sketch after this list.
- Perform the data transformations needed to populate Hadoop-based stores such as Hive and HBase, using tools like Pig.
- Hands-on creation of HQL scripts and Pig load scripts.
- Develop Datameer components: data links, workbooks, and export jobs.
- Establish best practices for the Big Data stack and share them across teams.
- Work on estimation and sizing for new project requirements.
- Interact with the client and report status on a daily basis.
- Create slides for customer presentations and meetings.
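A minimal HiveQL sketch of one possible data lake table, assuming hypothetical names and HDFS paths: an external, date-partitioned table over raw delimited files, so data lands in HDFS once and is queried in place.

```sql
-- Hypothetical Hive data model entry: external table partitioned by
-- date, pointing at raw delimited files already sitting in HDFS.
CREATE EXTERNAL TABLE IF NOT EXISTS lake.web_events (
    event_id   STRING,
    user_id    STRING,
    event_type STRING,
    payload    STRING
)
PARTITIONED BY (event_date STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
STORED AS TEXTFILE
LOCATION '/data/lake/web_events';

-- Register one day's folder as a partition; queries filtering on
-- event_date then read only that directory.
ALTER TABLE lake.web_events ADD IF NOT EXISTS
    PARTITION (event_date = '2017-01-01')
    LOCATION '/data/lake/web_events/event_date=2017-01-01';
```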