Search results

Number of results: 109

Abstract

The problem this paper investigates, namely optimization of overlay computing systems, follows naturally from the growing need for effective processing and, consequently, the fast development of various distributed systems. We consider an overlay-based computing system, i.e., a virtual computing system deployed on top of an existing physical network (e.g., the Internet) providing connectivity between computing nodes. The main motivation behind the overlay concept is the simple provision of network functionalities (e.g., diversity, flexibility, manageability) in a relatively cost-effective way, regardless of the physical and logical structure of the underlying networks. The workflow of tasks processed in the computing system assumes that there are many sources of input data and many destinations of output data, i.e., many-to-many transmissions are used in the system. The addressed optimization problem is formulated as an ILP (Integer Linear Programming) model. Since the model is computationally demanding and NP-complete, besides the branch-and-bound algorithm included in the CPLEX solver, we propose additional cut inequalities. Moreover, we present and test two effective heuristic algorithms: tabu search and greedy. Both methods yield satisfactory results close to the optimum.
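The abstract does not detail the greedy heuristic; as a rough, hypothetical illustration of greedy task-to-node assignment in a distributed computing system (the capacities, demands and selection rule below are invented for this sketch, not taken from the paper), one might write:

```python
# Illustrative greedy heuristic: assign each task to the computing node
# with the most remaining capacity. This is NOT the paper's algorithm,
# only a generic sketch of the greedy idea.

def greedy_assign(tasks, capacities):
    """Return, for each task demand, the index of the chosen node."""
    remaining = list(capacities)
    assignment = []
    for demand in tasks:
        node = max(range(len(remaining)), key=lambda i: remaining[i])
        if remaining[node] < demand:
            raise ValueError("no node can host a task of demand %s" % demand)
        remaining[node] -= demand
        assignment.append(node)
    return assignment

# Example: three nodes, four tasks.
print(greedy_assign([4, 3, 2, 1], [10, 6, 5]))  # → [0, 0, 1, 2]
```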

Authors and Affiliations

Krzysztof Walkowiak
Andrzej Kasprzak
Karol Andrusieczko

Abstract

We address one of the weaknesses of RSA ciphering systems, i.e., the existence of private keys that are relatively easy for an attacker to compromise. The problem can be mitigated by Internet service providers, but it requires some computational effort. We propose a proof of concept of a GPGPU-accelerated system that can help detect and eliminate users’ weak keys. We have proposed the algorithms and developed GPU-optimised program code that is now publicly available and substantially outperforms the tested CPU processor. The source code of the OpenSSL library was adapted for GPGPU, and the resulting code can run on both GPU and CPU processors. Additionally, we present a solution for mapping a triangular grid onto the GPU’s rectangular grid – the basic dilemma in many problems that concern pair-wise analysis of a set of elements. The comparison of two data caching methods on GPGPU also leads to interesting general conclusions. We present the results of performance-analysis experiments for the selected algorithms for various RSA key lengths, configurations of the GPU grid, and sizes of the tested key set.
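The triangular-to-rectangular grid mapping mentioned in the abstract is a standard index trick: a flat thread index k is decoded into a distinct pair (i, j), i < j, so that a rectangular grid of threads covers exactly the triangular pair-wise workload. A minimal sketch of one common closed-form decoding (not necessarily the paper's exact scheme):

```python
# Map a linear thread index k onto one cell (i, j), i < j, of the
# triangular pair-wise grid. Pairs are enumerated as
# (0,1)=0, (0,2)=1, (1,2)=2, (0,3)=3, ...
from math import isqrt

def pair_from_index(k):
    """Recover (i, j) with i < j from the flat index k."""
    j = (1 + isqrt(8 * k + 1)) // 2
    i = k - j * (j - 1) // 2
    return i, j

def index_from_pair(i, j):
    """Inverse mapping: flat index of the pair (i, j), i < j."""
    return j * (j - 1) // 2 + i

# Every thread k < n*(n-1)//2 gets a distinct key pair to compare.
n = 6
pairs = [pair_from_index(k) for k in range(n * (n - 1) // 2)]
print(pairs[:4])  # → [(0, 1), (0, 2), (1, 2), (0, 3)]
```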


Authors and Affiliations

Przemysław Karbownik
Paweł Russek
Kazimierz Wiatr

Abstract

In this paper we show how formal computer science concepts – such as encoding, algorithm or computability – can be interpreted philosophically, including ontologically and epistemologically. Such interpretations lead to questions and problems, the working solutions of which constitute some form of pre-philosophical worldview. In this work we focus on questions inspired by the IT distinction between digitality and analogicity, which has its origin in the mathematical distinction between discreteness and continuity. These include the following questions: 1) Is the deep structure of physical reality digital or analog? 2) Does the human mind more closely resemble a digital or an analog computational system? 3) Does the answer to the second question give us a cognitively fruitful insight into the cognitive limitations of the mind? As a particularly important basis for the above questions, we consider the fact that the computational power (i.e., the range of solvable problems) of some types of analog computation is greater than that of digital computation.


Authors and Affiliations

Paweł Stacewicz

Abstract

Computational Intelligence (CI) is one of the main trending and potent data-processing approaches for resolving difficult reliability problems, and it occupies an important position in intelligent reliability analysis and data management. Nevertheless, only a few broad reviews have summarized the current efforts of CI in reliability assessment of power systems. Many reliability-assessment methods aim to prolong the life cycle of a system, to maximize profit, and to predict the life cycle of assets or systems within an organization, especially in electric power distribution systems. Sustaining an uninterrupted electrical energy supply is an indicator of affluence and national growth. This review discusses the general background of reliability assessment in power distribution systems using computational intelligence, selected computational intelligence techniques, reliability engineering, the literature, theoretical and conceptual frameworks, methods of reliability assessment, and conclusions. The proposed technique can significantly reduce the time needed for reliability investigation in distribution networks, because a distribution network needs an algorithm that can evaluate, assess, measure and update the reliability indices and system performance within a short time. It can also manage outage data on assets and on the entire system for quick decision making, and can help prevent catastrophic failures. This review may be deemed valuable assistance for anybody doing research in the area.
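The reliability indices referred to above are, in standard practice (IEEE Std 1366), quantities such as SAIFI and SAIDI; the review does not list its exact indices, so the following is a generic sketch of how two of the common ones are computed from outage records:

```python
# Two standard distribution reliability indices (IEEE Std 1366).
# Each outage record is (customers interrupted, outage duration in hours);
# the sample records below are invented for illustration.

def saifi(outages, customers_served):
    """System Average Interruption Frequency Index (interruptions/customer)."""
    return sum(n for n, _ in outages) / customers_served

def saidi(outages, customers_served):
    """System Average Interruption Duration Index (hours/customer)."""
    return sum(n * hours for n, hours in outages) / customers_served

outages = [(100, 2.0), (50, 0.5)]   # (customers interrupted, hours)
print(saifi(outages, 1000))  # → 0.15
print(saidi(outages, 1000))  # → 0.225
```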

Authors and Affiliations

Elijah Adebayo Olajuyin ¹
Paul Kehinde Olulope ²
Emmanuel Taiwo Fasina ²

  1. Bamidele Olumilua University of Education, Science and Technology, Ikere Ekiti, Nigeria
  2. Ekiti State University, Ado Ekiti, Nigeria

Abstract

The paper presents the functionality and operation results of a system for creating dynamic maps of acoustic noise employing the PL-Grid infrastructure extended with a distributed sensor network. The work demonstrates the services being prepared within the PLGrid Plus project for measuring, modeling and rendering data related to noise level distribution in city agglomerations. Specific computational environments, the so-called domain grids, are developed in this project. For particular domain grids, specialized IT solutions are prepared, i.e. software implementations and hardware (infrastructure adaptation) dedicated to the demands of particular research groups, including acoustics (the domain grid “Acoustics”). The infrastructure and the software developed can be utilized mainly for research and education purposes; however, they can also help in urban planning. The engineered software is intended for creating maps of noise threat for road, railway and industrial sources. Integration of the software services with the distributed sensor network enables automatic updating of noise maps for a specific time period. A unique feature of the developed software is the possibility of evaluating auditory effects caused by exposure to excessive noise. The estimation of auditory effects is based on calculated noise levels in a given exposure period. The outcomes of this research study are presented in the form of the cumulative noise dose and the characteristics of the temporary threshold shift.
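Cumulative exposure of the kind described above is conventionally summarized by the equivalent continuous level L_eq, an energy average of interval levels; a minimal sketch of that standard formula (the project's exact dose model may differ, and the levels below are invented):

```python
# Equivalent continuous sound level:
#   L_eq = 10*log10( sum(t_i * 10^(L_i/10)) / sum(t_i) )
# for interval levels L_i (dB) measured over durations t_i.
import math

def leq(intervals):
    """intervals: list of (level_dB, duration) tuples."""
    total_t = sum(t for _, t in intervals)
    energy = sum(t * 10 ** (level / 10) for level, t in intervals)
    return 10 * math.log10(energy / total_t)

# Four hours at 80 dB followed by four hours at 90 dB:
print(round(leq([(80.0, 4.0), (90.0, 4.0)]), 1))  # → 87.4
```

Note that the louder interval dominates: the result sits much closer to 90 dB than the arithmetic mean of 85 dB, which is exactly why energy averaging is used.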

Authors and Affiliations

Bożena Kostek
Andrzej Czyżewski
Józef Kotus
Maciej Szczodrak

Abstract

A confidential algorithm for the approximate graph vertex cover problem is presented in this article. It preserves the privacy of data at every stage of the computation, which is very important in the context of cloud computing. The security of our solution is based on a fully homomorphic encryption scheme. The time complexity and the security aspects of the considered algorithm are described.
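For reference, the classic 2-approximation for vertex cover repeatedly takes both endpoints of any still-uncovered edge; the paper's contribution is running such an approximate algorithm under fully homomorphic encryption, which this plain (unencrypted) sketch does not attempt to show:

```python
# Classic 2-approximation for vertex cover: take both endpoints of
# every edge not yet covered. The returned cover is at most twice
# the size of an optimal one.

def approx_vertex_cover(edges):
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))
    return cover

# A 4-cycle: 0-1-2-3-0.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(sorted(approx_vertex_cover(edges)))  # → [0, 1, 2, 3]
```

Here the algorithm returns all four vertices while an optimal cover ({0, 2}) has two — within the factor-2 guarantee.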

Authors and Affiliations

Daniel Waszkiewicz
Aleksandra Horubała
Piotr Sapiecha
Michał Andrzejczak

Abstract

Mobile devices have become an integral part of our life and provide dozens of useful services to their users. However, the usability of mobile devices is hindered by battery lifetime. Energy conservation can extend battery lifetime; however, any energy management policy requires accurate prediction of energy consumption, which is impossible without reliable energy measurement and estimation methods and tools. We present an analysis of energy measurement methodologies and describe implementations of the internal (profiling) software-based (proprietary, custom) and external software-based (Java API, Sensor API, GSM AT) energy measurement methodologies. The methods are applied to measure energy consumption on a variety of mobile devices (laptop PC, PDA, smartphone). A case study of measuring energy consumption on a mobile computer using the 3DMark06 benchmarking software is presented.
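Whatever the measurement methodology, the underlying estimate reduces to integrating sampled power over time; a trapezoidal sketch with made-up power samples (not data from the paper):

```python
# Energy as the trapezoidal integral of power samples taken every
# dt seconds: E = sum( (p_i + p_{i+1}) / 2 * dt ).

def energy_joules(power_samples, dt):
    """power_samples in watts, dt in seconds; returns joules."""
    e = 0.0
    for p0, p1 in zip(power_samples, power_samples[1:]):
        e += 0.5 * (p0 + p1) * dt
    return e

# Hypothetical device power drawn, sampled once per second:
print(energy_joules([10.0, 12.0, 11.0, 9.0], 1.0))  # → 32.5
```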


Authors and Affiliations

Robertas Damaševičius
Vytautas Štuikys
Jevgenijus Toldinas

Abstract

This paper concerns measurement procedures on an emotion monitoring stand designed for tracking human emotions in Human-Computer Interaction with physiological characteristics. The paper addresses the key problem of physiological measurements being disturbed by motion typical for human-computer interaction, such as keyboard typing or mouse movements. An original experiment is described that aimed at a practical evaluation of the measurement procedures performed at the emotion monitoring stand constructed at GUT. Different locations of sensors were considered and evaluated for suitability and measurement precision in Human-Computer Interaction monitoring. Alternative locations (ear lobes and forearms) for skin conductance, blood volume pulse and temperature sensors were proposed and verified. The alternative locations showed correlation with the traditional ones as well as lower sensitivity to movements like typing or mouse moving; therefore, they can be a better solution for monitoring Human-Computer Interaction.
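The correlation between alternative and traditional sensor locations can be quantified with a plain Pearson coefficient between the two signals; a self-contained sketch with invented sample series (the paper's actual signals and statistics are not reproduced here):

```python
# Pearson correlation between two sensor signals, computed directly
# from its definition; sample values below are made up.
import math

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(var_x * var_y)

wrist = [0.2, 0.4, 0.5, 0.7, 0.9]    # hypothetical traditional location
ear = [0.25, 0.38, 0.55, 0.68, 0.88]  # hypothetical ear-lobe sensor
print(pearson(wrist, ear))  # close to 1: the signals track each other
```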


Authors and Affiliations

Agnieszka Landowska

Abstract

Some materials-related microstructural problems calculated using the phase-field method are presented. It is well known that the phase-field method requires mesh resolution of a diffuse interface. This makes the use of mesh adaptivity essential, especially for fast-evolving interfaces and other transient problems. Complex problems in 3D are also computationally challenging, so parallel computations are considered necessary. In this paper, a parallel adaptive finite element scheme is proposed. The scheme keeps the refinement level of each node and edge in 2D, and of each node and face in 3D, instead of the complete history of refinements, to facilitate derefinement. The information is local, the exchange of information is minimized, and less memory is used. The parallel adaptive algorithms, which run on distributed-memory machines, are applied in the numerical simulation of dendritic growth and capillary-driven flows.


Authors and Affiliations

M. Do-Quang
W. Villanueva
I. Singer-Loginova
G. Amberg

Abstract

Computational intelligence tools make a major contribution to analysing the properties of materials without much experimentation. B4C particles are used to improve the strength of materials; the percentage of these particles used in micro and nano composites determines the mechanical properties. Different combinations of input parameters determine the characteristics of the raw materials. The load and the content of B4C particles (0%, 2%, 4%, 6%, 8% and 10%) determine the wear behaviour, such as the coefficient of friction (CoF) and the wear rate. Material properties such as stress, strain, % of elongation and impact energy are studied. The temperature-dependent CoF and wear rate are analysed, with the temperature varied between 30°C, 100°C and 200°C. In addition, the CoF and wear rate of the materials are predicted with respect to load, weight % of B4C and nano hexagonal boron nitride (hBN) %. Intelligent tools (BPNN, RBNN, FL and decision tree) are applied to analyse these characteristics of micro/nano composites with the inclusion of B4C particles and nano hBN %, without physically conducting the experiments in the lab. The material properties are classified with respect to the range of input parameters using the computational model.


Authors and Affiliations

P. Radha
N. Selvakumar
R. Harichandran

Abstract

Parallel computers are becoming more available. A natural way to improve the computational efficiency of multibody simulations is parallel processing. In this work we estimate the efficiency of parallel computations performed using one of the commercial multibody solvers. First, a short theoretical outline is presented to give an overview of modeling issues in multibody dynamics. Next, the experimental part is demonstrated. A series of dynamics analyses is carried out. Test mechanisms with a variable number of bodies are used to gather performance results for the solver. The obtained data allow estimating the number of bodies sufficient to gain benefits from parallel computing, as well as the level of those benefits. The profits of parallel processing are also examined in the case of contact forces present in the system. Performance benefits are indicated when a multilink belt chain, whose model includes contact forces, is simulated.
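The question of when parallelism starts to pay off can be framed with Amdahl's law; a minimal sketch (the 20% serial fraction below is hypothetical, not a figure from the paper):

```python
# Amdahl's law: ideal speedup on n cores when a fraction s of the
# work is inherently serial.

def speedup(serial_fraction, n_cores):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_cores)

# With 20% inherently serial work, 8 cores give well under 8x:
print(round(speedup(0.2, 8), 2))  # → 3.33
```

This is why the benefit only appears above a certain model size: small mechanisms leave the per-step overhead (the serial fraction) dominant.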


Authors and Affiliations

Paweł Malczyk
Janusz Frączek

Abstract

Green spaces are an integral element of urban structures. They are not only a place of rest for their users, but also positively affect their well-being and health. The effect of these spaces is the better, the more smoothly they form a larger urban layout – strings of greenery. The introduction of urban greenery can and should be one of the basic elements of revitalization. Often, however, greenery is designed without the multi-aspect analysis that enables an understanding of the conditions and the use of the existing potential of a given place. The use of computational design in conjunction with generally available databases, such as numerical SRTM terrain models, the publicly available OSM map database and EPW meteorological data, allows for designing space in a more comprehensive way. These design methods allow better matching of the greenery design in a given area to specific architectural, urban and environmental conditions.


Authors and Affiliations

Lucyna Nyka
Jan Cudzik
Kacper Radziszewski
Dominik Sędzicki

Abstract

Half a century ago, two papers related to generalized inverses of cracovians were published by two different authors – in chronological order, Jean Dommanget and Helmut Moritz. Both independently developed papers demonstrated new theorems; however, a certain similarity between them appeared. Helmut Moritz, having recognized that situation, promised to mention it later in one of his published papers. This was never done, so the author of the present paper gives some details about the situation and claims his paternity.

Authors and Affiliations

Jean Dommanget

Abstract

In the paper, a noise map service designed for users interested in environmental noise is presented. The noise prediction algorithm and source model, developed for creating acoustic maps, work in a cloud computing environment. In the study, issues related to the modelling of sound propagation in urban spaces are discussed, with a particular focus on traffic noise. Examples of results obtained through a web application created for that purpose are shown. In addition, these are compared to results obtained from commercial software simulations based on two road noise prediction models. Moreover, the computing performance of the developed application is investigated and analyzed. A flowchart simulating the operation of the web-based noise service is presented, showing that the created application is easy to use even for people with little experience in computer technology.

Authors and Affiliations

Karolina Marciniuk
Bożena Kostek
Maciej Szczodrak

Abstract

The article is devoted to generation techniques for new public-key crypto-systems based on the application of indistinguishability obfuscation methods to selected private-key crypto-systems. The techniques are applied to a symmetric-key crypto-system, and the target system is an asymmetric one. As input for our approach, an implementation of a symmetric block cipher with a given private key is considered. Different obfuscation methods are subjected to processing. The target system would be treated as a public key for the newly created public crypto-system. The approach seems interesting from a theoretical point of view. Moreover, it can be useful for information protection in a cloud-computing model.

Authors and Affiliations

Aleksandra Horubała
Daniel Waszkiewicz
Michał Andrzejczak
Piotr Sapiecha

Abstract

In the age of Information and Communication Technology (ICT), the Web and the Internet have significantly changed the way applications are developed, deployed and used. One recent trend is the design of modern web applications based on SOA. This process is based on the composition of existing web services into a single scenario from the point of view of a particular user or client, which allows IT companies to shorten time to market. On the other hand, it raises questions about the quality of the application, trade-offs between quality factors and attributes, and measurements of these. Services are usually hosted and executed in an environment managed by their provider, which assures quality attributes such as availability or throughput. Therefore, in this paper an attempt has been made to perform quality measurements towards the creation of efficient, dependable and user-oriented Web applications. First, the process of designing service-based applications is described. Next, metrics for subsequent measurements of efficiency, dependability and usability of distributed applications are presented. These metrics assess the efforts and trade-offs in Web-based application development. As examples, we describe a pair of multimedia applications which we have developed in our department and executed in a cluster-based environment: one runs in the BeesyCluster middleware and the other on the Kaskada platform. For these applications we present results of measurements and draw conclusions about the relations between quality attributes in the presented application development model. This knowledge can be used to reason about such relations for new, similar applications, supporting their rapid and high-quality development.


Authors and Affiliations

Paweł Czarnul
Tomasz Dziubich
Hanna Krawczyk

Abstract

Reliable monitoring for detection of damage in epicyclic gearboxes is a serious concern for all industries in which these gearboxes operate in a harsh environment and in variable operational conditions. In this paper, autonomous multidimensional novelty detection algorithms are used to estimate the gearbox health state based on vectors of features calculated from the vibration signal. The authors examine various feature vectors, various sources of data and many different damage scenarios in order to compare novelty detection algorithms based on three different principles of operation: a distance in the feature space, a probability distribution, and an ANN (artificial neural network)-based model reconstruction approach. In order to compensate for non-deterministic results of training of neural networks, which may lead to different network performance, the ensemble technique is used to combine responses from several networks. The methods are tested in a series of practical experiments involving implanting a damage in industrial epicyclic gearboxes, and acquisition of data at variable speed conditions.
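The distance-in-feature-space principle named above can be reduced to a one-line rule: flag a feature vector as novel when its distance to the nearest healthy training vector exceeds a threshold. A minimal sketch (the feature values and threshold are invented, and the paper's actual detectors are more elaborate):

```python
# Nearest-neighbour novelty detection: a sample is novel if it lies
# farther than `threshold` from every vector of the healthy set.
import math

def is_novel(x, healthy, threshold):
    dist = min(math.dist(x, h) for h in healthy)
    return dist > threshold

healthy = [(1.0, 1.0), (1.1, 0.9), (0.9, 1.2)]   # hypothetical features
print(is_novel((1.0, 1.1), healthy, threshold=0.5))  # → False
print(is_novel((3.0, 0.2), healthy, threshold=0.5))  # → True
```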


Authors and Affiliations

Ziemowit Dworakowski
Kajetan Dziedziech
Adam Jabłoński

Abstract

The use of elastic bodies within multibody simulation has become more and more important in recent years. To include elastic bodies, described as finite element models, in multibody simulations, the dimension of the system of ordinary differential equations must be reduced by projection. For this purpose, the modal reduction method, a component mode synthesis based method, and a moment-matching method are used in this work. Due to the ever-increasing size of the non-reduced systems, the calculation of the projection matrix demands large computational resources and cannot be done on usual serial computers with the available memory. In this paper, the model reduction software Morembs++ is presented, using a parallelization concept based on the Message Passing Interface to satisfy the memory needs and reduce the runtime of the model reduction process. Additionally, the behaviour of the Block-Krylov-Schur eigensolver, implemented in the Anasazi package of the Trilinos project, is analysed with regard to the choice of the size of the Krylov basis, the block size, and the number of blocks. Besides, an iterative solver is considered within the CMS-based method.
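All three reduction methods named above share one core step: projecting the system onto a low-dimensional basis, x ≈ V q, so that a system matrix M becomes the much smaller VᵀM V. A toy pure-Python sketch of that projection (in Morembs++ the basis V comes from eigen- or Krylov solves, which are not shown here):

```python
# Galerkin projection of a system matrix onto a reduction basis:
# M_reduced = V^T M V, shrinking an n x n system to r x r (r = #columns of V).

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def transpose(a):
    return [list(row) for row in zip(*a)]

def reduce_matrix(m, v):
    """Project the n x n matrix m onto the columns of v."""
    return matmul(transpose(v), matmul(m, v))

# 3-DOF diagonal mass matrix projected onto a single mode shape:
m = [[2.0, 0.0, 0.0], [0.0, 2.0, 0.0], [0.0, 0.0, 2.0]]
v = [[1.0], [1.0], [1.0]]
print(reduce_matrix(m, v))  # → [[6.0]]
```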


Authors and Affiliations

Thomas Volzer
Peter Eberhard

Abstract

This work outlines a unified multi-threaded, multi-scale High Performance Computing (HPC) approach for the direct numerical simulation of Fluid-Solid Interaction (FSI) problems. The simulation algorithm relies on the extended Smoothed Particle Hydrodynamics (XSPH) method, which approaches the fluid flow in a Lagrangian framework consistent with the Lagrangian tracking of the solid phase. General 3D rigid body dynamics and an Absolute Nodal Coordinate Formulation (ANCF) are implemented to model rigid and flexible multibody dynamics. The two-way coupling of the fluid and solid phases is supported through the use of Boundary Condition Enforcing (BCE) markers that capture the fluid-solid coupling forces by enforcing a no-slip boundary condition. The solid-solid short-range interaction, which has a crucial impact on the small-scale behavior of fluid-solid mixtures, is resolved via a lubrication force model. The collective system states are integrated in time using an explicit, multi-rate scheme. To alleviate the heavy computational load, the overall algorithm leverages parallel computing on Graphics Processing Unit (GPU) cards. Performance and scaling analyses are provided for simulation scenarios involving one or multiple phases with up to tens of thousands of solid objects. The software implementation of the approach, called Chrono::Fluid, is part of the Chrono project and is available as open-source software.


Authors and Affiliations

Arman Pazouki
Radu Serban
Dan Negrut

Abstract

Computational modeling plays an important role in the methodology of contemporary science. The epistemological role of modeling and simulations leads to questions about a possible use of this method in philosophy. Attempts to use mathematical tools to formulate philosophical concepts trace back to Spinoza and Newton. Newtonian natural philosophy became an example of the successful use of mathematical thinking to describe the fundamental level of nature. Newton’s approach initiated a new scientific field of research in physics, and at the same time his system became a source of new philosophical considerations about physical reality. According to Michael Heller, some physical theories may be treated as formalizations of philosophical conceptions. Computational modeling may be an extension of this idea; this is what I would like to present in the article. I also consider computational modeling in philosophy as a source of new philosophical metaphors; this idea has been proposed in David J. Bolter’s conception of defining technology. The consideration leads to the following conclusion: significant changes have been taking place in the methodology of philosophy; the new approach does not make traditional methods obsolete – rather, it provides new analytical tools for philosophy and a source of inspiring metaphors.


Authors and Affiliations

Paweł Polak

Authors and Affiliations

Piotr Karwat ¹

  1. Department of Ultrasound, Institute of Fundamental Technological Research, Polish Academy of Sciences, Warsaw, Poland

Abstract

Disk motors are characterized by the axial direction of the main magnetic flux and the variable length of the magnetic flux path along varying stator/rotor radii. This is why it is generally accepted that reliable electromagnetic calculations for such machines should be carried out using the FEM on 3D models. The 3D approach makes it possible to take an entire spectrum of different effects into account. Such computational analysis is very time-consuming; this is particularly true for machines with only one magnetic axis. An alternative computational method, based on a 2D FEM model of a cylindrical motor, is proposed in the paper. The obtained calculation results have been verified against lab test results for a physical model. The proposed method leads to a significant decrease in computational time, i.e., a faster iterative search for the most advantageous design.


Authors and Affiliations

Tomasz Wolnik
