Institute for Data and High Performance Computing

Innovation Starts Here

Big Data and high performance computing (HPC) are highly interdisciplinary fields that are evolving quickly and having a fundamental impact on our technology-driven society. The Institute for Data and High Performance Computing (IDH) at Georgia Tech is developing and leveraging research in these fields to steer the course of innovation and create new capabilities in computing advancement and economic growth.

IDH is creating the technologies that are propelling Big Data and HPC forward. Through its NextGen Codes Initiative, IDH has seeded the development of scalable machine learning libraries, quantum chemistry codes, seismic-detection algorithms, and GPU-enabled computing tools for nuclear reactor design, among others. IDH launched two new centers in the past year: the Center for Data Analytics and the Center for High Performance Computing: From Big Data to Exascale Computing. These activities create synergies between computational experts and domain specialists and help define Georgia Tech's competitive advantage.

Growing the Research Enterprise

IDH is helping to define a future path for massive data and HPC research at Georgia Tech by developing networks of researchers and industry partners and fostering new opportunities in these burgeoning fields. As a partner in the Georgia Tech Research Institute's Big Data, Analytics and HPC Strategic Initiative, IDH is helping foster connections with academic researchers across campus in areas such as sustainability and energy, materials and manufacturing, bioscience and biomedicine, and security. IDH is actively exploring how to integrate Big Data and HPC into the larger Georgia Tech research culture and become a catalyst for innovation. Collaborations with industry partners and national sponsors are a key part of growing the research enterprise.
Community Building and Partnerships

Members of the research community have partnered with several federal agencies and Fortune 500 companies to meet their needs in a number of areas.

Technical Support Across Domains

IDH is taking on a specialized role in making computational advances readily available to domain scientists through software libraries and tools. This allows researchers to invest substantially in these tools, transfer them to new research pursuits, and gain access to a rich repository of managed Big Data and HPC technology. IDH supports the Keeneland Full Scale System, a Georgia Tech-led project establishing the most powerful GPU supercomputer available for research through NSF's Extreme Science and Engineering Discovery Environment (XSEDE) program.

Events, Training and Education

IDH supports collaboration and innovation by bringing together practitioners and thought leaders from Georgia Tech's leading areas through research events and activities. IDH provides support for training that advances researchers' opportunities to apply new technical skills and best practices. The institute offers short courses, workshops and distance education opportunities with industry-leading groups and research organizations. IDH enables knowledge transfer into targeted domain areas and creates impact by helping researchers identify real-world problems and providing solutions built on state-of-the-art methodologies and tools.

Richard Fujimoto: IDH Interim Director

Positioning for Future Research Opportunities

IDH is Georgia Tech's external face for massive data and HPC activities and research on campus. A dynamic research community is visible to a variety of stakeholders through IDH, enhancing Georgia Tech's ability to collaborate with industry and government. IDH demonstrates Georgia Tech's commitment to Big Data, one of 12 core research areas shaping Tech's vision to be a leader in the regional innovation ecosystem.
IDH supports industry events, including a tradeshow booth at the annual Supercomputing Conference, to enhance Georgia Tech's visibility and marketing efforts in Big Data and HPC. IDH also develops and maintains news and marketing initiatives related to Georgia Tech projects and researchers advancing the Big Data and HPC fields.

Institute for Data and High Performance Computing
Contact: IDH Interim Director Richard Fujimoto, Ph.D.
Computational Science & Engineering
Georgia Tech, Klaus Advanced Computing Building
266 Ferst Drive NW, Atlanta, GA
Fax:

IDH provides a platform for innovation in Big Data and HPC to tackle society's most important and challenging problems.

TABLE OF CONTENTS

XDATA Program: In a major research drive to crack Big Data, Georgia Tech is partnering on multiple federal projects, including the XDATA program from DARPA. XDATA is designed to create open source software tools that assimilate and process mountains of data to analyze trends and glean value from the data.

FLAMEL Traineeship Program: Funded by the NSF's IGERT program, FLAMEL trains a new type of data scientist capable of creating advanced materials and bringing them to market in a fraction of the time it now takes.

New Research Centers Create Growth: The Center for Data Analytics and the Center for High Performance Computing: From Big Data to Exascale Computing are two of Georgia Tech's emerging technology groups at the heart of innovation in the computing research space.

Next Generation Initiatives: New methods and tools in Big Data and HPC are critical to advancing major scientific research and economic growth. IDH's seed grant program has resulted in next generation initiatives designed to push the limits of scientific discovery.
Computational Biology: Research projects in the sciences and biomedical fields are a growing part of Georgia Tech's portfolio of Big Data innovations, leading to important medical breakthroughs and state-of-the-art computational tools.

Engagement with Industry: IDH is helping to define a future path for massive data and HPC research at Georgia Tech by developing networks of researchers and industry partners to foster new opportunities.

Partnerships and Training: IDH is leading the development of a growing network of Big Data and HPC research partners across campus, ensuring collaboration and knowledge transfer.

Positioning for Future Growth: As one of Georgia Tech's 12 core research areas, Big Data is a vital part of Tech's vision as a leader in the innovation ecosystem. IDH's outreach and marketing continue to enhance this position.

XDATA Aims to Extract Knowledge from Growing Digital Data

Georgia Tech Team Wins $2.7 Million Award to Advance Big-Data Technology for DARPA

The XDATA award is part of a $200 million multi-agency federal initiative for Big Data research and development announced in March. The initiative is aimed at improving the ability to extract knowledge and insights from the nation's fast-growing volumes of digital data. Numerous Big Data-related research endeavors are underway at Georgia Tech.

A research team led by Haesun Park, Professor in the School of Computational Science and Engineering and Director of the newly formed Center for Data Analytics, received a $2.7 million award from the Defense Advanced Research Projects Agency (DARPA) to develop technology intended to help address the challenges of Big Data. The contract is part of DARPA's XDATA program, a four-and-a-half-year research effort to develop new computational techniques and open-source software tools for processing and analyzing data, motivated by defense needs.
Selected by DARPA to perform research in the area of scalable analytics and data-processing technology, the team focuses on producing novel machine-learning approaches capable of analyzing very large-scale data. In addition, team members are pursuing the development of distributed computing methods that can implement data-analytics algorithms very rapidly by simultaneously utilizing a variety of parallel-processing environments and networked distributed computing systems.

The algorithms, tools and other technologies developed will be open source, to allow for customization. Under the open-source paradigm, collaborating developers create and maintain software and associated tools. Program source code is made widely available and can be improved by a community of developers and modified to address changing needs.

The Georgia Tech XDATA effort builds upon foundational methods and software developed under the Foundations of Data and Visual Analytics (FODAVA) research initiative, a 17-university program led by Georgia Tech and funded by the National Science Foundation and the Department of Homeland Security.

Richard Fujimoto

FLAMEL: From Learning, Analytics, and Materials to Entrepreneurship and Leadership Doctoral Traineeship Program

The FODAVA effort has produced multiple innovative visual analytics software systems, such as the FODAVA research test bed, the Visual Information Retrieval and Recommendation System (VisIRR), and other tools for interactive classification, clustering and topic modeling tasks. "The FODAVA document retrieval and recommendation system uses automated algorithms to give users a range of subject-search choices and information visualization capabilities in an integrated way, so that users can interact with the data and information throughout the problem-solving process to produce more meaningful solutions," said Park, who is also FODAVA's director.
"For XDATA, we will enhance these visualization and interaction capabilities and develop distributed algorithms that allow users to solve problems faster and on a larger scale than ever before."

Also participating from the School of Computational Science and Engineering are Professor Hongyuan Zha and Research Scientist Jaegul Choo, who previously led development of visual analytics systems on the FODAVA project. Investigators from the Georgia Tech Research Institute (GTRI) also contribute to the XDATA initiative. Senior Research Scientists Barry Drake and Richard Boyd are responsible for handling the computational demands of implementing the data analytics algorithms being developed. GTRI's task involves enabling these algorithms to run on a networked distributed computing system. By configuring the software to operate on multiple processors simultaneously, the researchers believe they can ensure that the algorithms solve large-scale problems very rapidly, a requirement of the DARPA award. The GTRI team is applying the latest advances in high performance numerical libraries to speed up the underlying computations of the higher-level data analytics algorithms and building tools to integrate the data analytics into a professional open-source package.

Georgia Tech to Exploit Big Data for Accelerating Materials Design and Manufacture through FLAMEL Traineeship Program

Georgia Tech has been awarded $2.8 million from the National Science Foundation to start a program to train a new type of data scientist capable of creating advanced materials and bringing them to market in a fraction of the time it now takes, typically 15 to 20 years.
"The goal of this program is to employ advances in big data and information technology to significantly reduce the timelines now required for new materials to be created and incorporated into commercial products," said School of Computational Science and Engineering Chair and Regents Professor Richard Fujimoto, the principal investigator for the grant. "The program will be transformational in bringing big data researchers together with materials scientists, engineers, and mathematicians to quantify the microstructures that comprise materials and develop new algorithms and software for their design," said Fujimoto.

The five-year program will provide funding for 24 doctoral trainees but is expected to create educational opportunities that will impact hundreds of Georgia Tech students. The new program includes a focus on entrepreneurship to enable graduate trainees to transform technical innovations into commercial products and services. Called FLAMEL (From Learning, Analytics, and Materials to Entrepreneurship and Leadership), the program will leverage Georgia Tech's recent investment in MatIN, a cyberinfrastructure platform designed to enable rapid interdisciplinary collaboration in materials development and manufacture. More information is available at. The program is funded through NSF's Integrative Graduate Education and Research Traineeship (IGERT) program, award number DGE.

New Research Centers are Foundation for Innovation

Two new research centers spawned by IDH are becoming leaders among Georgia Tech's emerging technology groups at the heart of research innovation in Big Data and HPC. The Center for Data Analytics (CDA) is Georgia Tech's newest lead in the design and development of a unified community for Big Data and analytics. The center, led by Professor Haesun Park, School of Computational Science and Engineering, is focused on enabling scientific advancement of today's challenging Big Data problems and providing integrated leadership on data analytics.
CDA is harnessing the institute's strengths by bringing together a large number of faculty practicing foundational disciplines in the Big Data research space. A broad and deep range of subject-matter expertise, including machine learning, modeling and simulation, and data visualization, will allow collaborators regionally and globally to investigate new directions anywhere within scientific inquiry. CDA is designed to bring more opportunities to Georgia Tech from external scientists, government agencies and large corporations to tackle scientific problems that cannot be solved without emerging computational toolsets.

Haesun Park

FODAVA Defines Foundations for Visual Analytics Research Field

The Foundations of Data and Visual Analytics (FODAVA) research initiative, a five-year project led by Georgia Tech under the direction of Professor Haesun Park, School of Computational Science and Engineering, is the national genesis for defining the computational foundations of the data and visual analytics fields. FODAVA is a collaborative effort among 17 partner institutions funded jointly by the National Science Foundation and the Department of Homeland Security. FODAVA performs foundational research and investigates ways to improve visual analytics of massive data sets through advances in areas such as machine learning, numeric and geometric computing, optimization, computational statistics and information visualization. The initiative established data and visual analytics as a distinct research field and built a dynamic community of researchers collaborating through foundational research, research workshops, conferences, industry engagement and technology transfer.
FODAVA-generated collaborations have continued to build, and technical reports as well as other research data from the national initiative may be found at

New Tools Weigh Connections Between Large-Scale Data in High Dimensional Space

A research team led by Computational Science and Engineering Professor Haesun Park is taking on the Big Data challenge through an approach that finds unseen structures in vast amounts of high dimensional data and represents them in the limited two-dimensional screen space, allowing interaction with the data to achieve more meaningful solutions. This new foundational method for clustering data is based on characterization of the space in which the data resides; it simultaneously reduces the dimensions so that the data is represented using only a smaller number of collapsed and informative features.

One example might be a database of research paper collections too numerous for manual viewing. Using a tool such as Georgia Tech's UTOPIAN (User-driven Topic Modeling Based on Interactive Nonnegative Matrix Factorization), the documents are simultaneously categorized into clusters (e.g., articles about similar topics stay together) while incorporating the user's prior knowledge through visual interaction. Park believes this clustering method could be applied in a number of ways because it captures meaningful cluster structure more naturally than completely automated methods, by allowing human knowledge to guide the solution process.

The Center for High Performance Computing (HPC): From Big Data to Exascale Computing, led by Executive Director David A. Bader, is designed to bring together Georgia Tech's interdisciplinary research and education activities in high performance computing in order to better leverage Georgia Tech's capabilities in this area and enable solving grand challenges in computational science and engineering.

David A. Bader

The center is strategically focused to meet the demands of federal priorities from multiple agencies over the next five years that are focused on high performance computing and Big Data. The center recognizes that the expanding cyberinfrastructures within organizations provide new HPC capabilities, which are essential, not optional, to the aspirations of research communities. The Georgia Tech Center for HPC is focused on being a dominant innovator in HPC and massive data technology and a creator of software and tools enabling and accelerating discovery and innovation in targeted application domains.

DARPA Awards Georgia Tech Energy-Efficient High-Performance Computing Contract

Georgia Tech is in the first phase of a cooperative agreement contract from the U.S. Defense Advanced Research Projects Agency (DARPA) to create the algorithmic framework for supercomputing systems that require much less energy than traditional high-speed machines, enabling devices in the field to perform calculations that currently require room-sized supercomputers. Awarded under DARPA's Power Efficiency Revolution for Embedded Computing Technologies (PERFECT) program for $561,130 for phase one of a negotiated three-phase $2.9 million contract, the cooperative agreement is one piece of a national effort to increase the computational power efficiency of embedded systems by 75-fold over the best current computing performance, in areas extending beyond traditional scientific computing. Computational Science and Engineering Professor David Bader is principal investigator on the Georgia Tech cooperative agreement, along with research scientist and co-PI Jason Riedy. The project bears the acronym GRATEFUL: Graph Analysis Tackling power-efficiency, Uncertainty and Locality. Such a system would have benefits in energy conservation and improve the tactical advantages of supercomputing in military situations.
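The nonnegative matrix factorization (NMF) underlying UTOPIAN-style topic clustering, described in the section above, can be sketched in a few lines. This is a minimal illustrative toy, not Georgia Tech's implementation: the multiplicative update rule, the tiny term-document matrix, and the rank-2 choice are all assumptions made for the example.

```python
# Toy NMF topic clustering: factor a nonnegative matrix V (documents x terms)
# into W (documents x topics) and H (topics x terms) so that V ~ W H, then
# assign each document to its dominant topic. Pure-Python multiplicative
# updates; a real system would use an optimized numerical library.
import random

def nmf(V, k, iters=300, eps=1e-9):
    random.seed(0)  # deterministic toy initialization
    m, n = len(V), len(V[0])
    W = [[random.random() for _ in range(k)] for _ in range(m)]
    H = [[random.random() for _ in range(n)] for _ in range(k)]

    def matmul(A, B):
        return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
                for row in A]

    def transpose(A):
        return [list(col) for col in zip(*A)]

    for _ in range(iters):
        # H <- H * (W^T V) / (W^T W H), elementwise
        WtV = matmul(transpose(W), V)
        WtWH = matmul(transpose(W), matmul(W, H))
        H = [[H[i][j] * WtV[i][j] / (WtWH[i][j] + eps) for j in range(n)]
             for i in range(k)]
        # W <- W * (V H^T) / (W H H^T), elementwise
        VHt = matmul(V, transpose(H))
        WHHt = matmul(W, matmul(H, transpose(H)))
        W = [[W[i][j] * VHt[i][j] / (WHHt[i][j] + eps) for j in range(k)]
             for i in range(m)]
    return W, H

# Four documents over four terms, with two obvious topics (terms 0-1 vs 2-3).
V = [[3, 2, 0, 0],
     [4, 3, 0, 1],
     [0, 0, 5, 4],
     [0, 1, 4, 3]]
W, _ = nmf(V, k=2)
clusters = [row.index(max(row)) for row in W]  # dominant topic per document
```

In UTOPIAN the user additionally steers the factorization interactively (e.g., by pinning documents together); the sketch above shows only the automated core.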
New Georgia Tech Software Recognizes Key Influencers Faster Than Ever

Determining the most influential person in a social media network is complex. Thousands of users are interacting about a single subject at the same time, and new people are constantly joining the streaming conversation. A Georgia Tech team, led by Computational Science and Engineering Professor David Bader, has developed a new algorithm that quickly determines the most connected person in a network. The algorithm can identify influencers as information changes within a network, making it a first-of-its-kind tool. The algorithm takes existing graph (network) data and performs only the minimal computations affected by the inserted edges, or connections. The measurement of how connected someone may be in a graph can be computed more than 100 times faster in some cases using the Georgia Tech software. "Despite a fragmented social media landscape, data analysts would be able to use the algorithm to look at each social media network and make inferences about a single influencer across these different platforms," said Bader. The project is supported by the National Science Foundation (Award Number CNS ). The open source software is available to businesses.

Next Generation Initiatives

Interdisciplinary Georgia Tech teams in various research areas, such as chemistry, nuclear energy, earthquake detection and large-scale data
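The incremental principle behind the influencer software described above, recomputing only what an inserted edge can affect, can be illustrated with a deliberately simplified sketch. The Georgia Tech tool tracks a much richer connectivity measure (a betweenness-style centrality) on streaming graphs; plain degree centrality is substituted here only to show the "touch only what changed" idea, and the class and node names are invented for the example.

```python
# Simplified incremental "most connected person" tracker. When an edge
# streams in, only the two endpoints' scores are updated; the rest of the
# graph is untouched, so each insertion costs O(1) instead of a full recompute.
from collections import defaultdict

class InfluencerTracker:
    def __init__(self):
        self.degree = defaultdict(int)  # node -> number of connections
        self.top = None                 # current most-connected node

    def insert_edge(self, u, v):
        # Only u and v can change rank after this insertion.
        for node in (u, v):
            self.degree[node] += 1
            if self.top is None or self.degree[node] > self.degree[self.top]:
                self.top = node

t = InfluencerTracker()
for edge in [("ann", "bob"), ("ann", "cat"), ("bob", "cat"), ("ann", "dan")]:
    t.insert_edge(*edge)
print(t.top)  # ann has the most connections
```

Note that edge deletions are harder in this scheme (the maximum can decrease, forcing a wider rescan), which is part of why incremental centrality on fully dynamic graphs is a research problem rather than a one-liner.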