2001
Optimising use of the Web (WWW) for LHC data analysis is a complex problem that illustrates the challenges arising from the integration of, and computation across, massive amounts of information distributed worldwide. Finding the right piece of information can, at times, be extremely time-consuming, if not impossible. So-called Grids have been proposed to facilitate LHC computing, and many groups have embarked on studies of data replication, data migration and networking philosophies. Other aspects, such as the role of 'middleware' for Grids, are emerging as requiring research. This paper argues the need for appropriate middleware that enables users to resolve physics queries across massive data sets. It identifies the role of meta-data in query resolution and the importance of Information Grids, rather than just Computational or Data Grids, for high-energy physics analysis. It then describes software being implemented at CERN to enable the querying of very large collaborating HEP data-sets, initially employed in the construction of the CMS detectors.
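A minimal sketch of the meta-data driven query resolution the abstract describes might look like the following Python; the catalogue class, data-set names, attributes and site names are all invented for illustration and do not reflect the actual CERN software.

    # Hypothetical sketch: resolve a physics query via meta-data, so that
    # only the sites holding relevant data-sets need to be contacted.
    from dataclasses import dataclass, field

    @dataclass
    class DatasetEntry:
        """Meta-data record describing one distributed HEP data-set."""
        name: str
        site: str                              # where the data physically lives
        attributes: dict = field(default_factory=dict)

    class MetadataCatalogue:
        """Maps query attributes to the data-sets that can answer them."""
        def __init__(self):
            self._entries: list[DatasetEntry] = []

        def register(self, entry: DatasetEntry) -> None:
            self._entries.append(entry)

        def resolve(self, **query) -> list[DatasetEntry]:
            # A data-set matches if it carries every requested attribute value.
            return [e for e in self._entries
                    if all(e.attributes.get(k) == v for k, v in query.items())]

    catalogue = MetadataCatalogue()
    catalogue.register(DatasetEntry("cms_ecal_test_beam", "cern.ch",
                                    {"detector": "ECAL", "year": 2000}))
    catalogue.register(DatasetEntry("cms_tracker_alignment", "in2p3.fr",
                                    {"detector": "Tracker", "year": 2001}))

    # Query resolution: find ECAL data-sets without scanning every site.
    for hit in catalogue.resolve(detector="ECAL"):
        print(hit.name, "->", hit.site)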
Il Nuovo Cimento C, 2009
Performance, reliability and scalability in data management are cornerstones of the computing Grid, where the volumes of data to be moved are huge and data analysis must be supported by high-performance, scalable storage resources. Data management issues are particularly pressing today, given the large data volumes and I/O load that the Large Hadron Collider (LHC) at CERN is going to produce. The EU-funded Enabling Grids for E-sciencE (EGEE) project, in which the Italian National Institute for Nuclear Physics (INFN) is a key member, is responsible for releasing and maintaining what is currently the world's largest production grid: a sophisticated hierarchy of data management and storage tools developed to help physicists, as well as other scientific communities, face distributed data management problems. This paper reviews the main technologies employed for storage and data management in EGEE and in the associated Worldwide LHC Computing Grid (WLCG) project.
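As a rough illustration of the kind of data-management primitive such a stack provides, the toy replica catalogue below maps a logical file name to the cheapest physical copy. The file names, site names and cost model are invented for the sketch; they are not EGEE or WLCG interfaces.

    # Toy replica catalogue in the spirit of grid data management: one
    # logical file name (LFN) can have several physical replicas.
    REPLICAS = {
        "lfn:/grid/cms/run1234/evts.root": [
            ("srm://storage.cern.ch/cms/run1234/evts.root", "CERN"),
            ("srm://storage.cnaf.infn.it/cms/run1234/evts.root", "INFN-CNAF"),
        ],
    }

    # Hypothetical per-site cost, e.g. network distance from the user.
    SITE_COST = {"CERN": 2, "INFN-CNAF": 1}

    def best_replica(lfn: str) -> str:
        """Map a logical file name to the cheapest physical replica."""
        url, _site = min(REPLICAS[lfn], key=lambda r: SITE_COST[r[1]])
        return url

    print(best_replica("lfn:/grid/cms/run1234/evts.root"))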
International Journal of Modern Physics A, 2005
The DØ experiment at Fermilab's Tevatron will record several petabytes of data over the next five years in pursuing the goals of understanding nature and searching for the origin of mass. The computing resources required to analyze these data far exceed the capabilities of any one institution. Moreover, the widely scattered geographical distribution of DØ collaborators poses further serious difficulties for the optimal use of human and computing resources. These difficulties will be exacerbated in future high-energy physics experiments such as the LHC. The computing grid has long been recognized as a solution to these problems. This technology is being made a more immediate reality for end users in DØ by developing a grid in the DØ Southern Analysis Region (DØSAR), DØSAR-Grid, using all available resources within the region and a home-grown local task manager, McFarm. We present the architecture in which DØSAR-Grid is implemented, the technologies used and the functionality of the grid, and our experience operating the grid for simulation, reprocessing and data analysis in a currently running HEP experiment.
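McFarm's actual interface is not described in the abstract; purely as an illustration of what a farm-style local task manager does, the sketch below dispatches simulation jobs to whichever regional site has free worker slots. Site names, slot counts and job payloads are hypothetical.

    # Generic farm-style dispatcher (not McFarm itself): assign queued
    # jobs to the regional site with the most free slots.
    from collections import deque

    sites = {"UTA": 2, "OU": 1, "LTU": 1}       # free slots per DØSAR site
    jobs = deque(f"montecarlo_{i}" for i in range(6))
    assignments: list[tuple[str, str]] = []

    while jobs:
        site = max(sites, key=sites.get)         # least-loaded site first
        if sites[site] == 0:
            break                                # region full; remaining jobs wait
        sites[site] -= 1
        assignments.append((jobs.popleft(), site))

    for job, site in assignments:
        print(f"{job} -> {site}")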
In 2006, when the CMS experiment commences taking data, CMS offline computing will consist of a tiered hierarchy of large computing facilities spread around the world: a large central tier-0 computing site near the experiment at CERN, five tier-1 regional centres at major research institutions, and about 25 tier-2 centres at universities around the world. By around 2003, major elements of this structure will already be in place, taking part in the enormous simulation effort that precedes the running experiment. Already, CMS has computing and data management needs equivalent to those of one of today's experiments. This paper describes the CMS requirements and activities for Grid computing, the work that has already taken place and is in use in the present simulation efforts, and techniques that deal with the problems of heterogeneous data sources.
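The tiered topology can be pictured with a small model like the one below. The counts follow the abstract (one tier-0, five tier-1s, about 25 tier-2s), while every centre name other than CERN is a placeholder.

    # Toy model of the tiered CMS computing topology described above.
    TIER0 = "CERN"
    TIER1 = [f"tier1_{i}" for i in range(5)]                  # regional centres
    TIER2 = {t1: [f"{t1}_uni_{j}" for j in range(5)]          # ~25 university centres
             for t1 in TIER1}

    def distribution_path(tier2_centre: str) -> list[str]:
        """Data flows tier-0 -> parent tier-1 -> requesting tier-2."""
        for t1, children in TIER2.items():
            if tier2_centre in children:
                return [TIER0, t1, tier2_centre]
        raise KeyError(tier2_centre)

    print(distribution_path("tier1_2_uni_3"))
    # ['CERN', 'tier1_2', 'tier1_2_uni_3']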
The Particle Physics Data Grid (PPDG) Collaboratory Pilot will develop, acquire and deliver vitally needed Grid-enabled tools for the data-intensive requirements of particle and nuclear physics. Novel mechanisms and policies will be vertically integrated with Grid middleware and experiment-specific applications and computing resources to form effective end-to-end capabilities. PPDG is a collaboration of computer scientists with a strong record in distributed computing and Grid technology, and physicists with leading roles in the software and network infrastructures of major high-energy and nuclear experiments. Together they have the experience, knowledge and vision in the scientific disciplines and technologies required to bring Grid-enabled data manipulation and analysis capabilities to the desk of every physicist. A three-year program is proposed, taking full advantage of the strong driving force provided by currently running physics experiments, ongoing computer science projects and recent advances in Grid technology. Our goals and plans are ultimately guided by the immediate, medium-term and longer-term needs and perspectives of the physics experiments, some of which will run for at least a decade from 2006, and by the research and development agenda of the computer science projects involved in this Collaboratory Pilot and other Grid-oriented efforts. PPDG is actively involved in establishing the necessary coordination between complementary Data Grid initiatives in the U.S., Europe and beyond.
2013
Today's large-scale science projects all involve worldwide collaborations that must routinely move tens of petabytes per year between international sites in order to be successful. This is true for the two largest experiments at the Large Hadron Collider (LHC) at CERN, ATLAS and CMS, and for the climate science community. In the near future, experiments like Belle II at the KEK accelerator in Japan, the genome science community, the Square Kilometre Array radio telescope, and ITER, the international fusion energy experiment, will all involve comparable data movement in order to accomplish their science. The capabilities required to support this scale of data movement involve hardware and software developments at all levels: fiber-optic signal transport, layer-2 transport, data transport (TCP is still the norm), operating system evolution, data movement and management techniques and software, and increasing sophistication in the distributed science applications. Further, ESnet's years of gathering science requirements indicate that these issues hold true across essentially all science disciplines that rely on the network for significant data transfer, even if the data quantities are modest compared to projects at the scale of the LHC experiments. This paper provides some of the context and state of the art for each of the topics that experience has shown are essential to enabling large-scale, data-intensive science.
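A back-of-envelope calculation makes the quoted scale concrete: sustaining tens of petabytes per year already implies a multi-gigabit-per-second average, before any protocol overhead or burstiness. The short Python check below is just that arithmetic.

    # Required sustained throughput for a given yearly transfer volume.
    PETABYTE_BITS = 1e15 * 8
    SECONDS_PER_YEAR = 365 * 24 * 3600

    def sustained_gbps(petabytes_per_year: float) -> float:
        return petabytes_per_year * PETABYTE_BITS / SECONDS_PER_YEAR / 1e9

    for pb in (10, 30, 50):
        print(f"{pb} PB/year -> {sustained_gbps(pb):.1f} Gb/s sustained")
    # 10 PB/year is ~2.5 Gb/s continuous, around the clock, all year.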
Strumenti per la didattica e la ricerca, 2011
Media Education (ME) has come a long way. Today it can no longer be considered a field of study reserved for semiotics and communication researchers, nor a privileged practice of those teachers who, for some reason, consider media of fundamental importance. On the one hand, ME is now part of the agenda of international organizations, which consider the development of media competences a necessary requisite for fully exercising citizenship in contemporary society. On the other, ME practices are becoming increasingly widespread in schools, involving a growing number of teachers. Nevertheless, teaching the media still seems to be a rather solipsistic task in which «everything is fine»: indeed, in ME there is a tremendous lack of research concerning the quality and effectiveness of educational practices. This book tries to cope with these issues by providing a set of instruments to design, develop and evaluate ME activities in schools, and supporting the enhan...