2013, 2014
DNS maps human-readable domain names to IP addresses, making it crucial for internet communication for both users and applications. DNS, an integral component of the internet, faces many challenges. Enormous data growth and inherent security weaknesses demand continuous monitoring and performance measurement of DNS traffic. DNSSEC can improve security at the cost of DNS performance, and this tradeoff needs to be evaluated before actual deployment; DNS performance measurement is critical for that evaluation. IPv6 will increase the load on DNS substantially, creating an even greater need for DNS performance evaluation. Despite this critical need, very little research has been done on DNS performance measurement, and most of it has focused on the client and authoritative layers of DNS. Performance measurement of the caching layer, the most vulnerable and functionally important layer, remains hugely under-researched. Some software and hardware techniques are available for DNS performance measurement. The software techniques produce bursts of unrealistic data for a limited time, restricting the scope of performance measurement and evaluation, while hardware data-generation systems can be both expensive and inflexible. There is a need for more realistic DNS traffic that can support flexible and cost-effective performance measurement over longer periods for thorough evaluation.
Lecture Notes in Computer Science, 2002
This paper presents the design of a next-generation network traffic monitoring and analysis system, called NG-MON (Next Generation MONitoring), for high-speed networks of 10 Gbps and above. Packet capture and analysis on such high-speed networks is very difficult using traditional approaches. Using distributed, pipelined, and parallel processing techniques, we have designed a flexible and scalable monitoring and analysis system that can run on off-the-shelf, cost-effective computers. The monitoring and analysis task in NG-MON is divided into five phases: packet capture, flow generation, flow store, traffic analysis, and presentation. Each phase can be executed on a separate computer system and cooperates with adjacent phases using pipeline processing. Each phase can be composed of a cluster of computers wherever the load of that phase exceeds the capacity of a single computer system. We have defined efficient communication methods and message formats between phases. Numerical analysis results of our design for 10 Gbps networks are also provided. The design addresses four requirements. Lossless packet capture: all packets on the link must be captured without loss so as to provide the required information to various applications. Flow-based analysis: when analyzing, it is better to aggregate packet information into flows for efficient processing; by doing this, packets can be compressed without any loss of information. Consideration of limited storage: the volume of captured packets in high-speed networks exceeds hundreds of megabytes per minute even after aggregation into flows [2], so an efficient method is needed for storing these large amounts of flow and analysis data in limited storage. Support for various applications: the system should be flexible enough to provide data to various applications in diverse forms; when a new application needs to use the system, it should be supportable without changing the structure of the system.
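The flow-generation phase described above — aggregating per-packet records into flow records keyed by the 5-tuple — can be sketched minimally as follows. The packet tuple layout and field names here are illustrative assumptions, not NG-MON's actual message formats:

```python
from collections import defaultdict

# Hypothetical per-packet records: (src_ip, dst_ip, src_port, dst_port, proto, bytes)
packets = [
    ("10.0.0.1", "10.0.0.2", 12345, 80, "tcp", 1500),
    ("10.0.0.1", "10.0.0.2", 12345, 80, "tcp", 400),
    ("10.0.0.3", "10.0.0.2", 55555, 53, "udp", 60),
]

def aggregate_flows(packets):
    """Collapse per-packet records into flows keyed by the 5-tuple,
    keeping running packet and byte counters per flow."""
    flows = defaultdict(lambda: {"packets": 0, "bytes": 0})
    for src, dst, sport, dport, proto, size in packets:
        key = (src, dst, sport, dport, proto)
        flows[key]["packets"] += 1
        flows[key]["bytes"] += size
    return dict(flows)

flows = aggregate_flows(packets)
# The two packets sharing a 5-tuple collapse into one flow record,
# illustrating the lossless compression the paper describes.
```

This is why flow aggregation compresses without information loss for counting purposes: per-flow totals are preserved exactly while the per-packet records are discarded.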
2011
ETOMIC is a network traffic measurement platform with high-precision, GPS-synchronized monitoring nodes. The infrastructure is publicly available to the network research community, supporting advanced experimental techniques by providing high-precision hardware equipment and a Central Management System. Researchers can deploy their own active measurement code to perform experiments on the public Internet. Recently, the functionality of the original system was significantly extended and new-generation measurement nodes were deployed. The system now also includes well-structured data repositories to archive and share raw and evaluated data. These features make ETOMIC one of the experimental facilities that support the design, development, and validation of novel experimental techniques for the future Internet. In this paper we focus on the improved capabilities of the management system, the recent extensions of the node architecture, and the accompanying database solutions.
IEEE Eighth International Conference on Signal Image Technology and Internet Based Systems (SITIS 2012), 25-29, November 2012 - Sorrento - Naples, Italy.
Understanding the ever-changing scenario of computer networks and how they operate in the real world implies measuring and analyzing their characteristics. This in turn requires a set of advanced tools and methodologies to be shared among researchers, along with the data derived from such activities. In this paper we first present some of the main issues and challenges in the field of Internet monitoring and measurement; then we present several open-source platforms we have developed over the last 10 years for monitoring heterogeneous and large-scale networks. Finally, we describe some of the data sets we have made publicly available to the research community.
IEEE International Conference on Communications (ICC '09), 2009
The domain name service (DNS) provides a critical function in directing Internet traffic. Defending DNS servers from bandwidth attacks is assisted by the ability to effectively mine DNS log data for statistical patterns. Processing DNS log data can be classified as a ...
2010 IEEE Globecom Workshops, 2010
The ever-increasing complexity and diversity of the Internet pose several challenges to network operators, administrators, and, in general, Internet users. More specifically, because of the diversity in applications and usage patterns, the prevalence of dynamic IP addresses, and applications that do not conform to standard configurations (e.g., VoIP used to bypass firewalls), monitoring and securing networks and end hosts is no longer a trivial task. In this paper, we propose Host and netwOrk System Profiler and Internet Traffic AnaLysis (HOSPITAL): a tool for the summarization and characterization of traffic and the troubleshooting of potentially suspicious activities. HOSPITAL provides the network operator as well as the user with knowledge about applications, communicating parties, services required/provided, etc., at different levels of granularity (e.g., individual hosts, /24 blocks, a large enterprise), all presented concisely through an easy-to-use web interface. Moreover, HOSPITAL is a lightweight, self-contained tool that incurs little overhead and offers configuration and customization capabilities for users and developers.
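The multi-granularity summarization described above — rolling per-host statistics up to, for example, /24 blocks — can be sketched as follows. The input layout is a hypothetical illustration, not HOSPITAL's actual interface:

```python
import ipaddress
from collections import Counter

# Hypothetical per-host traffic summary: host IP -> observed flow count
host_flows = {"192.0.2.4": 10, "192.0.2.200": 5, "198.51.100.7": 3}

def rollup_to_slash24(host_flows):
    """Summarize per-host counts at /24 granularity by mapping each
    host address to its enclosing /24 network and summing counts."""
    blocks = Counter()
    for ip, count in host_flows.items():
        # strict=False lets a host address stand in for its network
        net = ipaddress.ip_network(f"{ip}/24", strict=False)
        blocks[str(net)] += count
    return dict(blocks)

blocks = rollup_to_slash24(host_flows)
# Hosts 192.0.2.4 and 192.0.2.200 merge into the single block 192.0.2.0/24.
```

The same pattern extends to coarser granularities (e.g., an enterprise-wide prefix) simply by changing the prefix length.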
For my children David, Michael, and Jonathan, who know how to detect falsehood. Do not charge for this resource, nor charge a membership fee to share it. This resource is completely FREE!!! www.tronodegracia.com
Eskiyeni, 2024
The debate over the unity or division of sciences is a significant topic in the history and philosophy of science. From ancient philosophers to today, scientists have attempted to unify, classify, or segment the sciences. Greek philosophers approached this issue through the concepts of “One” and “Many.” For instance, Parmenides focused on static substances, whereas Heraclitus emphasized becoming and flux. Empedocles pointed to the four elements, Democritus to atoms, Pythagoras to numbers, Plato to forms, and Aristotle to categories. In the Islamic world, Ibn Khaldun expanded the unity of sciences through social sciences, while Avicenna classified sciences based on practical and universal aspects. With the return to nature in the 16th century, sciences were reshaped through natural sciences. F. Bacon emphasized the importance of experiment and observation, categorizing sciences in a pyramid. Galileo proposed that nature is structured on mathematical symbols, while Descartes and Leibniz developed their scientific views based on Newtonian physics. Kant evaluated sciences not by searching behind phenomena but as a cognitive unity based on principles. In the modern era, positivism gained prominence. Since the 19th century, thinkers of the Vienna Circle advocated for the unity of sciences, reducing all epistemic activities to positive science. These thinkers envisaged scientific philosophy grounded in physical science. Reductionism was considered the most important method for achieving the unity of sciences. This paper will critique Fodor’s physicalism based on reductionism and defend Dupre’s idea of the disunity of sciences. Dupre argues that the reductionist method cannot be applied to special sciences like biology and that, although occasionally interacting, sciences should be considered independent modes of knowledge with their specific domains and methods. 
The paper will begin by presenting the historical development of the unity of sciences within the context of the philosophy of science, focusing on the views of the Vienna Circle philosophers. It will examine the thoughts of neo-positivist philosophers such as Carnap (1928, 1934) and Nagel (1961) on the unity of sciences, Hempel’s nomological-deductive explanation model (1965), and the claims of Oppenheim/Putnam (1958) regarding the unity of sciences. Subsequently, Fodor’s views on the unity of sciences will be evaluated, and physicalism based on reductionism will be analysed. After discussing the shortcomings and errors of this view, Dupre’s argument for the disunity of sciences will be examined, particularly highlighting the inapplicability of reductionism in special sciences like biology. Ultimately, it will be argued that, rather than achieving a unified science reduced to physical laws, the contemporary understanding of science is better served by recognizing the distinct and autonomous nature of different scientific fields.
Fashion Highlight, 2023
The current article focuses on emerging transformations in fashion creative processes with regard to advancing digitalization, the opportunities offered by new sustainable business models, and a new relationship between user, production, and consumption. In particular, this article discusses the case of the Italian manufacturing districts, where digitalization strongly pervades production, integrating local craftsmanship savoir-faire with up-to-date technologies. Within the strategic duo "fashion and technology", we highlight emerging opportunities for the integration of creative processes and manufacturing skills. Moreover, the need for sustainable practices offers significant new insights into the integration of the designer's role with manufacturing processes. Moving from this discussion, the article presents an overview of ongoing transformations of fashion design practices in relation to technological and social issues, offering a framework for reading the articles included in this issue.