  • 1. Abdullah, Omed Gh.
    et al.
    Tahir, Dana A.
    Kadir, K.
    Stockholms universitet, Naturvetenskapliga fakulteten, Institutionen för material- och miljökemi (MMK). Kurdistan Institution for Strategic Studies and Scientific Research, Iraq.
    Optical and structural investigation of synthesized PVA/PbS nanocomposites (2015). In: Journal of materials science. Materials in electronics, ISSN 0957-4522, E-ISSN 1573-482X, Vol. 26, no. 9, pp. 6939-6944. Article in journal (Peer-reviewed)
    Abstract [en]

    Polymer nanocomposites based on polyvinyl alcohol (PVA) and lead sulfide (PbS) nanoparticles with average radii of 1.88-2.23 nm have been synthesized using the chemical reduction route and solution casting technique for different concentrations of PbS. The characterization of the polymer nanocomposite films was carried out using UV-visible spectroscopy, SEM, and XRD. The effect of various concentrations of PbS NPs on the optical properties of the composite has been studied to understand the optimum conditions for the synthesis process. The nanocomposite films show high UV and visible light absorption in the wavelength range of 200-500 nm, corresponding to the characteristics of the PbS NPs. A significant decreasing trend of the direct allowed band gap of the nanocomposite was observed upon increasing the Pb source concentration, from 6.27 eV for pure PVA to 2.34 eV for 0.04 M PbS, which is still much higher than the band gap of bulk PbS (0.41 eV). The calculated values of the static refractive index from the Cauchy dispersion model were in the range of 1.09-1.20. X-ray diffraction analysis confirmed the formation of the cubic nanocrystalline PbS phase.
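    Direct allowed band gaps such as those quoted above are typically extracted from absorbance spectra by a Tauc analysis, extrapolating the linear part of (αhν)² versus hν to zero; the abstract does not spell out the exact procedure, so the Python sketch below is only a minimal illustration of that standard method, and the arrays and fitting window are hypothetical placeholders.

        import numpy as np

        def tauc_direct_band_gap(photon_energy_eV, absorbance, fit_window):
            """Estimate a direct allowed band gap by fitting the linear region of
            (alpha*h*nu)^2 vs. h*nu and extrapolating to zero (Tauc analysis).
            Absorbance is taken as proportional to alpha for a fixed film
            thickness; fit_window = (lo, hi) bounds the linear region in eV."""
            y = (absorbance * photon_energy_eV) ** 2
            lo, hi = fit_window
            mask = (photon_energy_eV >= lo) & (photon_energy_eV <= hi)
            slope, intercept = np.polyfit(photon_energy_eV[mask], y[mask], 1)
            return -intercept / slope          # x-intercept = band gap estimate (eV)

        # Hypothetical usage:
        # E = np.linspace(1.5, 6.5, 500)       # photon energies (eV)
        # A = ...                              # measured absorbance on the same grid
        # Eg = tauc_direct_band_gap(E, A, fit_window=(2.6, 3.2))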

  • 2.
    Al Sabbagh, Bilal
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap.
    Cybersecurity Incident Response: A Socio-Technical Approach (2019). Doctoral thesis, comprising papers (Other academic)
    Abstract [en]

    This thesis examines the cybersecurity incident response problem using a socio-technical approach. The motivation of this work is the need to bridge the knowledge and practice gap that exists because of the increasing complexity of cybersecurity threats and our limited capability of applying the cybersecurity controls necessary to adequately respond to these threats. Throughout this thesis, knowledge from Systems Theory, Soft Systems Methodology and Socio-Technical Systems is applied to examine and document the socio-technical properties of the cybersecurity incident response process. The holistic modelling of the cybersecurity incident response process yielded concepts and methods that were tested to improve socio-technical security controls and minimise the existing gap in security controls.

    The scientific enquiry of this thesis is based on pragmatism as the underpinning research philosophy. The thesis uses a design science research approach and embeds multiple research methods to develop five artefacts (concept, model, method, framework and instantiation) outlined in nine peer-reviewed publications. The instantiated artefact embraces the knowledge developed during this research to provide a prototype for a socio-technical security information and event management system (ST-SIEM) integrated with an open source SIEM tool. The relevance of the artefact was validated through a panel of cybersecurity experts using a Delphi method. The Delphi method indicated that the artefact can improve the efficacy of handling cybersecurity incidents.

  • 3. Barbeiro, A. R.
    et al.
    Ureba, Ana
    Stockholms universitet, Naturvetenskapliga fakulteten, Fysikum. Universidad de Sevilla, Spain; Instituto de Biomedicina de Sevilla, IBIS, Spain; Karolinska Institutet, Sweden.
    Baeza, J. A.
    Linares, R.
    Perucha, M.
    Jimenez-Ortega, E.
    Velazquez, S.
    Mateos, J. C.
    Leal, A.
    3D VMAT Verification Based on Monte Carlo Log File Simulation with Experimental Feedback from Film Dosimetry (2016). In: PLoS ONE, ISSN 1932-6203, E-ISSN 1932-6203, Vol. 11, no. 11, article id e0166767. Article in journal (Peer-reviewed)
    Abstract [en]

    A model based on a specific phantom, called QuAArC, has been designed for the evaluation of planning and verification systems for complex radiotherapy treatments, such as volumetric modulated arc therapy (VMAT). This model uses the high accuracy provided by Monte Carlo (MC) simulation of log files and allows experimental feedback from the high spatial resolution of films hosted in QuAArC. This cylindrical phantom was specifically designed to host films rolled at different radial distances, able to take into account the entrance fluence and the 3D dose distribution. Ionization chamber measurements are also included in the feedback process for absolute dose considerations. In this way, automated MC simulation of treatment log files is implemented to calculate the actual delivery geometries, while the monitor units are experimentally adjusted to reconstruct the dose-volume histogram (DVH) on the patient CT. Prostate and head-and-neck clinical cases, previously planned with the Monaco and Pinnacle treatment planning systems and verified with two different commercial systems (Delta4 and COMPASS), were selected in order to test the operational feasibility of the proposed model. The proper operation of the feedback procedure was proved through the high agreement achieved between reconstructed dose distributions and the film measurements (global gamma passing rates > 90% for the 2%/2 mm criteria). The necessary discretization level of the log file for dose calculation and the potential mismatch between calculated control points and the detection grid in the verification process were discussed. Besides the effect of the dose calculation accuracy of the analytic algorithm implemented in treatment planning systems for a dynamic technique, the importance of the detection density level and its location in the VMAT-specific phantom for obtaining a more reliable DVH in the patient CT was also discussed. The proposed model showed enough robustness and efficiency to be considered as a pre-treatment VMAT verification system.

  • 4. Benjaminsson, Simon
    et al.
    Lansner, Anders
    Stockholms universitet, Naturvetenskapliga fakulteten, Numerisk analys och datalogi (NADA). Royal Institute of Technology, Sweden.
    Nexa: A scalable neural simulator with integrated analysis (2012). In: Network: Computation in Neural Systems, ISSN 0954-898X, Vol. 23, no. 4, pp. 254-271. Article in journal (Peer-reviewed)
    Abstract [en]

    Large-scale neural simulations encompass challenges in simulator design, data handling and understanding of simulation output. As the computational power of supercomputers and the size of network models increase, these challenges become even more pronounced. Here we introduce the experimental scalable neural simulator Nexa, for parallel simulation of large-scale neural network models at a high level of biological abstraction and for exploration of the simulation methods involved. It includes firing-rate models and capabilities to build networks using machine learning inspired methods for e.g. self-organization of network architecture and for structural plasticity. We show scalability up to the size of the largest machines currently available for a number of model scenarios. We further demonstrate simulator integration with online analysis and real-time visualization as scalable solutions for the data handling challenges.

  • 5. Bertels, Koen
    et al.
    Jacques, Jean-Marie
    Boman, Magnus
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap.
    Risk and crises management in complex systems (2005). In: Micro, meso, macro: addressing complex systems couplings / [ed] Hans Liljenström, Uno Svedin, New Jersey: World Scientific, 2005, pp. 305-316. Chapter in book, part of anthology (Other academic)
  • 6. Biteus, Jonas
    et al.
    Lindgren, Tony
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap.
    Planning Flexible Maintenance for Heavy Trucks using Machine Learning Models, Constraint Programming, and Route Optimization (2017). In: SAE International Journal of Materials & Manufacturing, ISSN 1946-3979, E-ISSN 1946-3987, Vol. 10, no. 3, pp. 306-315. Article in journal (Peer-reviewed)
    Abstract [en]

    Maintenance planning of trucks at Scania has previously been done using static cyclic plans with fixed sets of maintenance tasks, determined by mileage, calendar time, and some data-driven physical models. Flexible maintenance has improved the maintenance program with the addition of general data-driven expert rules and the ability to move sub-sets of maintenance tasks between maintenance occasions. Meanwhile, successful modelling with machine learning on big data, automatic planning using constraint programming, and route optimization hint at the ability to achieve even higher fleet utilization by further improvements of the flexible maintenance. The maintenance program has therefore been partitioned into its smallest parts and formulated as individual constraint rules. The overall goal is to maximize the utilization of a fleet, i.e. to maximize the ability to perform transport assignments, with respect to maintenance. A sub-goal is to minimize the costs of vehicle breakdowns and maintenance actions. The maintenance planner takes as input customer preferences and maintenance task deadlines, where the existing expert rule for the component has been replaced by a predictive model. Using machine learning, operational data have been used to train a predictive random forest model that can estimate the probability that a vehicle will have a breakdown given its operational data as input. The route optimization takes predicted vehicle health into consideration when optimizing routes and assignment allocations. The random forest model satisfactorily predicts failures, the maintenance planner successfully computes consistent and good maintenance plans, and the route optimizer gives optimal routes within tens of seconds of operation time. The model, the maintenance planner, and the route optimizer have been integrated into a demonstrator that highlights the usability and feasibility of the suggested approach.

  • 7.
    Bodnar, Taras
    et al.
    Stockholms universitet, Naturvetenskapliga fakulteten, Matematiska institutionen.
    Dmytriv, Solomiia
    Parolya, Nestor
    Schmid, Wolfgang
    Tests for the Weights of the Global Minimum Variance Portfolio in a High-Dimensional Setting (2019). In: IEEE Transactions on Signal Processing, ISSN 1053-587X, E-ISSN 1941-0476, Vol. 67, no. 17, pp. 4479-4493. Article in journal (Peer-reviewed)
    Abstract [en]

    In this paper, we construct two tests for the weights of the global minimum variance portfolio (GMVP) in a high-dimensional setting, namely, when the number of assets p depends on the sample size n such that p/n → c ∈ (0, 1) as n tends to infinity. In the case of a singular covariance matrix with rank equal to q, we assume that q/n → c̃ ∈ (0, 1) as n → ∞. The considered tests are based on the sample estimator and on the shrinkage estimator of the GMVP weights. We derive the asymptotic distributions of the test statistics under the null and alternative hypotheses. Moreover, we provide a simulation study where the power functions and the receiver operating characteristic curves of the proposed tests are compared with other existing approaches. We observe that the test based on the shrinkage estimator performs well even for values of c close to one.
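    For context, the GMVP weights referred to above have the standard closed form w = Σ⁻¹1 / (1ᵀΣ⁻¹1). The Python sketch below computes the sample (plug-in) estimator of these weights and a generic convex shrinkage toward the equally weighted portfolio; the shrinkage intensity is a free parameter here, not the data-driven intensity analysed in the paper, and the data are synthetic.

        import numpy as np

        def gmvp_weights(returns):
            """Sample estimator of the GMVP weights, w = S^{-1} 1 / (1' S^{-1} 1),
            where S is the sample covariance matrix of the asset returns."""
            S = np.cov(returns, rowvar=False)        # p x p sample covariance
            ones = np.ones(S.shape[0])
            w = np.linalg.solve(S, ones)             # S^{-1} 1
            return w / (ones @ w)

        def shrunk_gmvp_weights(returns, alpha):
            """Convex shrinkage of the sample GMVP weights toward the equally
            weighted portfolio; alpha is user-chosen here for illustration."""
            w_sample = gmvp_weights(returns)
            w_target = np.full_like(w_sample, 1.0 / w_sample.size)
            return alpha * w_sample + (1.0 - alpha) * w_target

        # Synthetic example with p/n = 100/500 = 0.2
        rng = np.random.default_rng(0)
        R = rng.normal(size=(500, 100))              # n = 500 observations, p = 100 assets
        print(shrunk_gmvp_weights(R, alpha=0.7).sum())   # weights sum to one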

  • 8.
    Brodin, Jane
    Stockholms universitet, Lärarhögskolan i Stockholm (LHS).
    Communication and assistive technology for persons with mental retardation (1997). In: Advancement of assistive technology / [ed] G. Anogianakis, C. Bühler, M. Soede, Amsterdam: IOS Press, 1997, pp. 81-84. Chapter in book, part of anthology (Other academic)
  • 9.
    Brown, Barry
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap.
    The Social Life of Autonomous Cars (2017). In: Computer, ISSN 0018-9162, E-ISSN 1558-0814, Vol. 50, no. 2, pp. 92-96. Article in journal (Peer-reviewed)
    Abstract [en]

    Until the day comes when all vehicles are fully autonomous, self-driving cars must be more than safe and efficient; they must also understand and interact naturally with human drivers. The web extras include videos demonstrating "rude" behavior by Tesla's Autopilot system, www.youtube.com/watch?v=el4OdwtgzNk; a human driver confused by self-driving technology, www.youtube.com/watch?v=Uj-rK8V-rik; and aggressive driving prompted by self-driving technology, www.youtube.com/watch?v=FbSQm3YaAzA.

  • 10. Cardoso, G.
    et al.
    Stadler, M.
    Siddiqui, Afzal
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap. University College London, UK.
    Marnay, C.
    DeForest, N.
    Barbosa-Povoa, A.
    Ferrao, P.
    Microgrid reliability modeling and battery scheduling using stochastic linear programming (2013). In: Electric power systems research, ISSN 0378-7796, E-ISSN 1873-2046, Vol. 103, pp. 61-69. Article in journal (Peer-reviewed)
    Abstract [en]

    This paper describes the introduction of stochastic linear programming into Operations DER-CAM, a tool used to obtain optimal operating schedules for a given microgrid under local economic and environmental conditions. This application follows previous work on optimal scheduling of a lithium-iron-phosphate battery given the output uncertainty of a 1 MW molten carbonate fuel cell. Both are in the Santa Rita Jail microgrid, located in Dublin, California. This fuel cell has proven unreliable, partially justifying the consideration of storage options. Several stochastic DER-CAM runs are executed to compare different scenarios to values obtained by a deterministic approach. Results indicate that using a stochastic approach provides a conservative yet more lucrative battery schedule. Given fuel cell outages, this leads to lower expected energy bills, with potential savings exceeding 6%.
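    The stochastic formulation inside DER-CAM is far richer than can be shown here, but the Python sketch below illustrates the basic kind of battery-scheduling linear program involved: a deterministic toy model that chooses hourly charge/discharge decisions to minimize the cost of grid imports. All prices, loads and battery parameters are made-up placeholders.

        import numpy as np
        from scipy.optimize import linprog

        T = 4                                            # hours in the toy horizon
        price = np.array([0.10, 0.30, 0.40, 0.15])       # grid tariff per kWh
        load = np.array([20.0, 25.0, 30.0, 22.0])        # site load per hour (kWh)
        cap, soc0, rate, eta = 40.0, 20.0, 10.0, 0.9     # capacity, initial SOC, power limit, efficiency

        # Decision vector x = [charge_1..T, discharge_1..T, soc_1..T]
        n = 3 * T
        c_obj = np.concatenate([price, -price, np.zeros(T)])   # cost of grid import (load term is constant)

        # State-of-charge dynamics: soc_t - soc_{t-1} - eta*charge_t + discharge_t/eta = 0
        A_eq, b_eq = np.zeros((T, n)), np.zeros(T)
        for t in range(T):
            A_eq[t, t] = -eta
            A_eq[t, T + t] = 1.0 / eta
            A_eq[t, 2 * T + t] = 1.0
            if t > 0:
                A_eq[t, 2 * T + t - 1] = -1.0
        b_eq[0] = soc0

        # No grid export: discharge_t - charge_t <= load_t
        A_ub, b_ub = np.zeros((T, n)), load.copy()
        for t in range(T):
            A_ub[t, t], A_ub[t, T + t] = -1.0, 1.0

        bounds = [(0, rate)] * (2 * T) + [(0, cap)] * T
        res = linprog(c_obj, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
        print("charge:", res.x[:T], "discharge:", res.x[T:2 * T])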

  • 11. Chen, Shuzhen
    et al.
    Wu, Desheng
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Företagsekonomiska institutionen. University of Chinese Academy of Sciences, China.
    Connectivity, Netting, and Systemic Risk of Payment Systems (2019). In: IEEE Systems Journal, ISSN 1932-8184, E-ISSN 1937-9234, Vol. 13, no. 2, pp. 1658-1668. Article in journal (Peer-reviewed)
    Abstract [en]

    The stability of payment systems is of vital importance to the credit market as well as to economic development. Most research focuses on the effect of system connectivity on systemic risk and demonstrates that connectivity provides both a risk-spreading channel and a risk-sharing mechanism. But the management of systemic risk is quite different in real-time gross settlement systems and net settlement systems. We provide an integrated analysis of the effect of connectivity and netting on systemic risk in payment systems by considering more detailed network structures of pure creditors and pure debtors. We show that the effect of netting is partly due to the change of network connectivity, which severs the contagion channel of shocks. Moreover, netting can lower the actual magnitude of the shock from the beginning by reducing the source bank's debt.

  • 12. Conti, Maurizio
    et al.
    Eriksson, Lars
    Stockholms universitet, Naturvetenskapliga fakulteten, Fysikum.
    Westerwoudt, Victor
    Estimating Image Quality for Future Generations of TOF PET Scanners (2013). In: IEEE Transactions on Nuclear Science, ISSN 0018-9499, E-ISSN 1558-1578, Vol. 60, no. 1, pp. 87-94. Article in journal (Peer-reviewed)
    Abstract [en]

    Images taken with time-of-flight (TOF) positron emission tomography (PET) scanners are of improved quality compared to equivalent non-TOF images. This improvement depends on the scanner time resolution. The present generation of commercial TOF scanners has a time resolution in the range of 500-600 ps full width at half maximum. In this work we investigate how the image characteristics will improve for future generations of TOF PET. We performed a Geant4 simulation of a 30-cm uniform cylinder containing hot spheres, with time resolution ranging from 600 to 200 ps. Data were reconstructed using TOF filtered back projection (FBP) and TOF ordered subsets expectation maximization (OSEM), with non-TOF reconstruction as a reference. Images were compared in terms of contrast recovery and variance in the image. The TOF gain was evaluated for both reconstruction methods. The TOF gain was also evaluated versus counts in the scan, in order to understand the behavior of this gain at very low statistics. Using TOF FBP, it was shown that the TOF gain can be used as a sensitivity amplifier, reducing (according to the expected TOF gain) the number of counts necessary to produce an image of the same characteristics. Some limitations in the TOF gain were observed at very low counts, particularly when using iterative methods.
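    As a rough point of reference (a commonly quoted rule of thumb, not a result taken from this paper), the TOF sensitivity gain is often estimated as the object size D divided by the TOF localization length Δx = c·Δt/2, where Δt is the coincidence timing resolution. The snippet below evaluates that estimate for a 30 cm object over the timing range studied above.

        # Rule-of-thumb TOF sensitivity gain (illustrative only).
        c_cm_per_ps = 0.03                       # speed of light: 0.03 cm per ps
        D = 30.0                                 # object diameter in cm
        for dt_ps in (600, 400, 200):
            dx = c_cm_per_ps * dt_ps / 2.0       # TOF localization length (cm)
            print(f"{dt_ps} ps -> gain ~ {D / dx:.1f}x")
        # 600 ps -> ~3.3x, 400 ps -> ~5x, 200 ps -> ~10x relative to non-TOF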

  • 13.
    Cunningham, Miriam
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap.
    Technology-Enhanced Learning in Kenyan Universities (2016). In: IEEE technology & society magazine, ISSN 0278-0097, E-ISSN 1937-416X, Vol. 35, no. 3, pp. 28-35. Article in journal (Peer-reviewed)
    Abstract [en]

    This article discusses some of the findings of a study that bridges an existing knowledge gap by focusing on identifying influences on the wider adoption and uptake of TEL techniques by HEIs in Nairobi. In this context, TEL techniques can encompass e-learning, blended learning, using massive open online courses (MOOCs), or an entirely online course delivery. This study examines why HEIs are using TEL, perceived benefits and challenges of using TEL from an institutional and instructor perspective, and the impact of policies. The findings have important research, practical, societal, and policy making implications for educational delivery on a continent with a rapidly growing population. Findings will assist decision making, inform policy creation, and provide useful foundational reference material for further comparative research in Africa. The lessons learned will also assist tertiary level institutions across the African continent that wish to plan for wider TEL adoption, or to implement TEL in a more effective manner, by considering common challenges that could limit adoption.

  • 14. Dang, Khue-Dung
    et al.
    Quiroz, Matias
    Kohn, Robert
    Minh-Ngoc, Tran
    Villani, Mattias
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Statistiska institutionen. Linköping University, Sweden; ARC Centre of Excellence for Mathematical and Statistical Frontiers (ACEMS), Australia.
    Hamiltonian Monte Carlo with Energy Conserving Subsampling (2019). In: Journal of machine learning research, ISSN 1532-4435, E-ISSN 1533-7928, Vol. 20, pp. 1-31, article id 100. Article in journal (Peer-reviewed)
    Abstract [en]

    Hamiltonian Monte Carlo (HMC) samples efficiently from high-dimensional posterior distributions with proposed parameter draws obtained by iterating on a discretized version of the Hamiltonian dynamics. The iterations make HMC computationally costly, especially in problems with large data sets, since it is necessary to compute posterior densities and their derivatives with respect to the parameters. Naively computing the Hamiltonian dynamics on a subset of the data causes HMC to lose its key ability to generate distant parameter proposals with high acceptance probability. The key insight in our article is that efficient subsampling HMC for the parameters is possible if both the dynamics and the acceptance probability are computed from the same data subsample in each complete HMC iteration. We show that this is possible to do in a principled way in an HMC-within-Gibbs framework where the subsample is updated using a pseudo-marginal MH step and the parameters are then updated using an HMC step based on the current subsample. We show that our subsampling methods are fast and compare favorably to two popular sampling algorithms that use gradient estimates from data subsampling. We also explore the current limitations of subsampling HMC algorithms by varying the quality of the variance-reducing control variates used in the estimators of the posterior density and its gradients.
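    To make the "Hamiltonian dynamics" and "acceptance probability" mentioned above concrete, here is a minimal textbook full-data HMC iteration in Python: leapfrog integration followed by a Metropolis accept/reject step. The paper's contribution, computing both the dynamics and the acceptance probability from a data subsample inside a pseudo-marginal Gibbs scheme, is not reproduced here; the target distribution and tuning constants are placeholders.

        import numpy as np

        def hmc_step(theta, log_post, grad_log_post, eps=0.1, n_leapfrog=20,
                     rng=np.random.default_rng()):
            """One full-data HMC iteration: leapfrog integration of the Hamiltonian
            dynamics, then a Metropolis test on the change in total energy."""
            p = rng.normal(size=theta.shape)              # resample momentum
            th, q = theta.copy(), p.copy()
            q += 0.5 * eps * grad_log_post(th)            # half step for momentum
            for _ in range(n_leapfrog - 1):
                th += eps * q                             # full step for position
                q += eps * grad_log_post(th)              # full step for momentum
            th += eps * q
            q += 0.5 * eps * grad_log_post(th)            # final half step
            h_old = -log_post(theta) + 0.5 * p @ p        # Hamiltonian = potential + kinetic
            h_new = -log_post(th) + 0.5 * q @ q
            if np.log(rng.uniform()) < h_old - h_new:     # accept with prob min(1, exp(h_old - h_new))
                return th, True
            return theta, False

        # Placeholder target: a standard bivariate normal posterior
        log_post = lambda t: -0.5 * t @ t
        grad_log_post = lambda t: -t
        theta = np.zeros(2)
        for _ in range(100):
            theta, _ = hmc_step(theta, log_post, grad_log_post)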

  • 15.
    Elly Amani, Gamukama
    et al.
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap.
    Larsson, Aron
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap.
    Popov, Oliver
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap.
    Mugisha, Joseph Y. T.
    Group Decision Evaluation of Internet Services in the Context of Development. Manuscript (preprint) (Other academic)
    Abstract [en]

    The paper presents a group decision assessment of Internet services in the context of development (ISCD). The assessment is achieved through the use of a decision model whose fundamental goal is to provide a systematic approach for addressing the problem of misalignments among the Internet stakeholders' objectives. The modelling of the problem is approached from the perspective of delivering/receiving the Internet services that maximize the respective stakeholders' objectives. Based on AHP theory, it structures the problem into four hierarchies with three aspects of consideration: (a) service relevance in the context of development, (b) convergence of the service delivery mechanism to IP infrastructure, and (c) service commensurability with traffic class requirements. An assessment of the aggregated individually derived final priorities (AIP) reveals that, for aligning the stakeholders' objectives at the local level, end users should first strive to implement the Internet components/applications that can have a high impact on their transactions/business, followed by those services/applications that can "empower" them to fulfil their goals. At the global level, the affordability of recurring subscriptions for Internet access, the cost of end user terminal equipment, and coverage range/penetration are the key issues that policy makers should address in view of achieving the ISCD objectives. Finally, the paper includes strategic options for the best course of action in aligning the stakeholders' objectives.

  • 16.
    Eriksson, Lars
    et al.
    Stockholms universitet, Naturvetenskapliga fakulteten, Fysikum.
    Conti, M.
    Melcher, C. L.
    Zhuravleva, M.
    Eriksson, M.
    Rothfuss, H.
    LuYAP/LSO Phoswich Detectors for High Resolution Positron Emission Tomography (2013). In: IEEE Transactions on Nuclear Science, ISSN 0018-9499, E-ISSN 1558-1578, Vol. 60, no. 1, pp. 194-196. Article in journal (Peer-reviewed)
    Abstract [en]

    The spatial resolution in positron emission tomography (PET) can be improved by the addition of depth-of-interaction (DOI) information. This can be achieved by using the phoswich approach in which depth identification relies on differences in scintillation decay time and pulse shape discrimination techniques. In this paper we have looked at a special phoswich combination LuAP/LSO or LuYAP/LSO. This combination of scintillators is especially interesting since LuAP and LuYAP have emission in the excitation band of LSO, which may have an impact on the timing resolution of the detector. As will be shown in this paper, the phoswich concept based on these two scintillators can be utilized, however, with some limitations. This paper is an extension of our previous phoswich investigation [3].

  • 17.
    Everitt, Tom
    et al.
    Stockholms universitet, Naturvetenskapliga fakulteten, Matematiska institutionen.
    Lattimore, Tor
    Hutter, Marcus
    Free Lunch for Optimisation under the Universal Distribution (2014). In: 2014 IEEE Congress on Evolutionary Computation (CEC), New York: IEEE Computer Society, 2014, pp. 167-174. Conference paper (Peer-reviewed)
    Abstract [en]

    Function optimisation is a major challenge in computer science. The No Free Lunch theorems state that, if all functions with the same histogram are assumed to be equally probable, then no algorithm outperforms any other in expectation. We argue against the uniform assumption and suggest that a universal prior exists for which there is a free lunch, but where no particular class of functions is favoured over another. We also prove upper and lower bounds on the size of the free lunch.

  • 18. Forssen, Jens
    et al.
    Mauriz, Laura Estevez
    Torehammar, Clas
    Jean, Philippe
    Axelsson, Östen
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Psykologiska institutionen.
    Performance of a Low-Height Acoustic Screen for Urban Roads: Field Measurement and Numerical Study (2019). In: Acta Acustica united with Acustica, ISSN 1610-1928, E-ISSN 1861-9959, Vol. 105, no. 6, pp. 1026-1034. Article in journal (Peer-reviewed)
    Abstract [en]

    Field measurements and numerical modelling were used to study the acoustic performance of a low screen in an urban road setting. The results show the usefulness of low screens as well as suggest improvements in screen design. For the measurements, an acoustic screen built up from concrete modules was temporarily installed beside a small park on the reservation between a two-lane road and a track for walking and cycling. A larger traffic system, of which the two-lane road is a part, determines the daytime equivalent noise level within the urban area. The screen height was about 1.4 m as measured from the level of the road surface, and the width of the screen top was 0.3 m. Measurements were carried out both at 20 m distance from the road (within the park) and at 5 m distance from the road (at the cycle track). Insertion loss in maximum level, using controlled light-vehicle pass-by at 50 km/h, was measured to be 10 dB at 5 m distance and 6 dB at 20 m distance, at 1.5 m height. Insertion loss in equivalent level was measured within the park to be 4 dB at 1.5 m height. A listening experiment confirmed a perceived improvement from installing the screen. The measured results were also compared with predicted results using a boundary element method (BEM) and noise mapping software, the latter showing good agreement, overestimating the equivalent level insertion loss by 1 dB in the park. The BEM comparison showed reasonable agreement in maximum level insertion loss considering that facade reflections were excluded, with an overestimation of 5 dB at the cycle track, and good agreement in the park, overestimating the equivalent and maximum level insertion losses by up to 1 dB. BEM predictions were also used to investigate other screen designs, showing a positive effect of an acoustically soft screen top, significant for a screen width of 0.2 m and increasing for wider screens.

  • 19.
    Gamukama, Elly A.
    et al.
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap.
    Popov, Oliver B.
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap.
    A Social Welfare Approach in Increasing the Benefits from the Internet in Developing Countries (2011). In: International Journal on Network Security, ISSN 2152-5064, Vol. 2, no. 4, pp. 29-33. Article in journal (Peer-reviewed)
    Abstract [en]

    The paper examines Internet usage and its market environment in developing countries under the perceived assumption that the Internet is one of the most important drivers for development. It gives an insight into the implications of processes (both unintended and intended) and their effects on achieving real Internet benefits in environments where network infrastructures are limited, such as the ones found in the developing regions. A welfare-based approach is proposed in which the Internet providers and end users identify a set of objectives that leads them to achieving increased benefits. An analytical model of the main characteristics of the approach is presented, and it is eventually shown how the end user bit rate could be regulated based on utility bounds that lead to general satisfaction for all users. User satisfaction signifies delivery of the expected QoS as well as willingness to pay for such services.

  • 20.
    Gamukama, Elly Amani
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap.
    Analytical modelling of Internet provision and usage in context of development through a utility based framework (2015). Doctoral thesis, comprising papers (Other academic)
    Abstract [en]

    Information and communication technology and development (ICTD) is a research area that has broadly captured the attention of the public and academics in the last two decades. It deals with the interactions and relations between humans and society in general on one side and technology on the other. The focus in this thesis is on computing and communication technology, herein referred to as the "Internet – the IP based technology", which is seen as one of the enablers of economic and social growth.

    The benefit of Internet connectivity and usage in inducing and enhancing positive social changes in basic dimensions of human life is generally accepted as one of the most important drivers of development. The success and inevitability of the Internet in the developed world make its proliferation and diffusion essential in less developed countries. However, these processes are sometimes impaired by unintended and intended consequences created by the social dynamics that drive current information technology innovations and evolutions and by stakeholders' desire to fulfil their own utility egos, all coupled with market environments.

    This thesis takes an insight into both unintended and intended implications and their effects on enabling development in environments where Internet Protocol (IP) based infrastructures are limited, as in least developed regions/countries. The results of this study have led to:

    a)      Establishing the basic Internet services that would trigger the exploitation of one’s potential for development.

    This has been achieved through the use of analytical scientific methods to classify Internet traffic characteristics and derive the relevance levels of their corresponding Internet services groups in fostering development.

    b)      Developing a framework that lays down structured guidelines to help decision makers, especially in least developed countries, make scientifically informed subjective judgements about Internet services in the context of development.

    c)      Designing and developing the Internet Services in the Context of Development (ISCD) model, which enables aligning the apparently divergent/misaligned objectives of Internet stakeholders in the present Internet structure so that each can obtain its maximised intended benefits.

    Empirical testing of the model led to strategic options for aligning stakeholders' goals in view of the ISCD along two main domains: (i) network management policies, which focus on the provision of services, and (ii) Internet consumption/usage, which focuses on service relevance, commensurability with requirements specific to LDCs, and convergence of the service delivery mechanism to all-IP.

  • 21.
    Gamukama, Elly Amani
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap.
    Fairness on the Internet and its Importance in Development Context (2008). Conference paper (Peer-reviewed)
    Abstract [en]

    The notion of fairness with respect to resource sharing among competing flows is one of the important considerations in network design. This is true in particular for IP networks (which are the foundation of the Internet), where the service model is based on best effort and any possible distortion of it may lead to flow starvation and eventually system imbalances. In fact, fairness should be one of the major objectives on both the network layer and the transport layer. This is evident in the case of elastic flows such as TCP, where fairness may have a major impact on congestion resolution. On the network layer, fairness mechanisms combined with scheduling and queuing policies lead to equitable service, which may also induce higher router utilization and hence better network performance. The paper investigates the current trends in understanding and applying the fairness concept on the Internet and hence in heterogeneous networks. It then studies and examines the extension of the fairness concept in the context of development and developing regions, where both the traditional lack of infrastructure and costly communication services have affected the penetration of the Internet and a more even distribution of its benefits. The key question is whether or not it is plausible to identify a framework for the evaluation of efficiency-fairness tradeoffs that may provide a sound basis for a model of more equitable access to the Internet for a diversity of users with different needs and financial possibilities, representing mainly developing regions and emerging economies.

  • 22.
    Gamukama, Elly Amani
    et al.
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap.
    Larsson, Aron
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap.
    Popov, Oliver
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap.
    Mugisha, J. Y. T.
    The Decision Model for the Internet Services in the Context of Development (2015). In: Procedia Computer Science, ISSN 1877-0509, E-ISSN 1877-0509, Vol. 55, pp. 622-631. Article in journal (Peer-reviewed)
    Abstract [en]

    The Internet Services in the Context of Development (ISCD) model is structured in four levels of hierarchy based on Analytic Hierarchy Process (AHP) theory. The model provides a formal approach to establishing the relative importance of Internet services in the context of fostering national development. This paper presents the fundamental concepts of the model. The Pairwise Comparisons (PC) technique, the cornerstone of AHP theory, is used as the baseline technique for measuring the intensity of preference between the Internet traffic classes (and thereby the respective services they deliver to end users) in the process of formulating the judgment matrix. The ISCD model is designed to process data obtained from a group of individual decision makers who are independent of each other. Hence, decision makers are weighted in the process of aggregating their priority vectors, and the normalized weighted geometric mean method (NWGMM) is used to compute the group's priority vector, which is the final output of the model.
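    The two computational steps named above, deriving a priority vector from a pairwise comparison matrix and aggregating individual vectors with the normalized weighted geometric mean method, are standard AHP operations. The Python sketch below shows generic versions of both; the comparison matrices and decision-maker weights are illustrative, not data from the paper.

        import numpy as np

        def ahp_priorities(pairwise):
            """Priority vector of a reciprocal pairwise comparison matrix:
            the principal right eigenvector, normalized to sum to one."""
            vals, vecs = np.linalg.eig(np.asarray(pairwise, dtype=float))
            v = np.abs(vecs[:, np.argmax(vals.real)].real)
            return v / v.sum()

        def nwgmm(priority_vectors, dm_weights):
            """Normalized weighted geometric mean aggregation of individual
            priority vectors into a single group priority vector."""
            P = np.asarray(priority_vectors)          # one row per decision maker
            w = np.asarray(dm_weights, dtype=float)
            w = w / w.sum()
            g = np.prod(P ** w[:, None], axis=0)      # weighted geometric mean per alternative
            return g / g.sum()

        # Illustrative 3x3 comparison matrices for two decision makers
        A1 = [[1, 3, 5], [1/3, 1, 2], [1/5, 1/2, 1]]
        A2 = [[1, 2, 4], [1/2, 1, 2], [1/4, 1/2, 1]]
        group = nwgmm([ahp_priorities(A1), ahp_priorities(A2)], dm_weights=[0.6, 0.4])
        print(group)                                  # group priority vector, sums to one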

  • 23.
    Gamukama, Elly Amani
    et al.
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap.
    Popov, Oliver
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap.
    The Level of Scientific Methods Use in Computing Research Programs (2008). Conference paper (Peer-reviewed)
    Abstract [en]

    The research investigates the level to which scientists use scientific methods in computing research programs. Data was collected from a representative sample of researchers in the field. The findings show that present research programs are largely driven by market forces. Innovations come up as a consequence of satisfying market calls, but not necessarily as a result of advancement in basic science. Researchers' investigations are driven by three characteristics: proof of performance, concept and existence. Also noted from the study, some researchers lack a clear distinction between the methods. They tend to mix methods in their research programs as long as the industry accepts their outcome artifact. Consequently, there is a lack of a clear curriculum to instill such methodological concepts at the graduate level in some of the computing schools.

  • 24.
    Giannoulis, Constantinos
    et al.
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap.
    Kabilan, Vandana
    A Method for VVA Tailoring: The REVVA Generic Process Tailoring Case Study (2007). Conference paper (Peer-reviewed)
  • 25.
    Giannoulis, Constantinos
    et al.
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap.
    Svee, Eric-Oluf
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap.
    Zdravkovic, Jelena
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap.
    Capturing Consumer Preference in System Requirements Through Business Strategy (2013). In: International Journal of Information System Modeling and Design, ISSN 1947-8186, E-ISSN 1947-8194, Vol. 4, no. 4, pp. 1-26. Article in journal (Peer-reviewed)
    Abstract [en]

    A core concern within Business-IT alignment is coordinating strategic initiatives and plans with Information Systems (IS). Substantial work has been done on linking strategy to requirements for IS development, but it has usually been focused on the core value exchanges offered by the business, and thus overlooking other aspects that influence the implementation of strategy. One of these, consumer preferences, has been proven to influence the successful provisioning of the business's customer value proposition, and this study aims to establish a conceptual link between both strategy and consumer preferences to system requirements. The core contention is that reflecting consumer preferences through business strategy in system requirements allows for the development of aligned systems, and therefore systems that better support a consumer orientation. The contribution of this paper is an approach to establish such alignment, with this being accomplished through the proposal of a consumer preference meta-model mapped to a business strategy meta-model further linked to a system requirements technique. The validity of this proposal is demonstrated through a case study carried out within an institution of higher education in Sweden.

  • 26.
    Giannoulis, Constantinos
    et al.
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap.
    Zdravkovic, Jelena
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap.
    A Design Science Perspective on Business Strategy Modeling (2014). In: Enterprise, Business-Process and Information Systems Modeling: 15th International Conference, BPMDS 2014, 19th International Conference, EMMSAD 2014, Held at CAiSE 2014, Thessaloniki, Greece, June 16-17, 2014. Proceedings / [ed] Ilia Bider, Khaled Gaaloul, John Krogstie, Selmin Nurcan, Henderik A. Proper, Rainer Schmidt, Pnina Soffer, Springer Berlin/Heidelberg, 2014, pp. 424-438. Conference paper (Peer-reviewed)
    Abstract [en]

    An important topic in the modeling for IS development concerns quality of obtained models, especially when these models are to be used in global scopes, or as references. So far, a number of model quality frameworks have been established to assess relevant criteria such as completeness, clarity, modularity, or generality. In this study we take a look at how a research process contributes to the characteristics of a model produced during that process. For example: what should be observed; what research methods should be selected and how should they be applied; what kind of results should be expected; how they should be evaluated, etc. We report a result on this concern by presenting how we applied Design Science Research to model business strategy.

  • 27.
    Golod, Taras
    et al.
    Stockholms universitet, Naturvetenskapliga fakulteten, Fysikum.
    Iovan, Adrian
    Stockholms universitet, Naturvetenskapliga fakulteten, Fysikum.
    Krasnov, Vladimir M.
    Stockholms universitet, Naturvetenskapliga fakulteten, Fysikum.
    Single Abrikosov vortices as quantized information bits (2015). In: Nature Communications, ISSN 2041-1723, E-ISSN 2041-1723, Vol. 6, article id 8628. Article in journal (Peer-reviewed)
    Abstract [en]

    Superconducting digital devices can be advantageously used in future supercomputers because they can greatly reduce the dissipation power and increase the speed of operation. Non-volatile quantized states are ideal for the realization of classical Boolean logics. A quantized Abrikosov vortex represents the most compact magnetic object in superconductors, which can be utilized for the creation of high-density digital cryoelectronics. In this work we provide a proof of concept for an Abrikosov-vortex-based random access memory cell, in which a single vortex is used as an information bit. We demonstrate high-endurance write operation and two different ways of read-out using a spin valve or a Josephson junction. These memory cells are characterized by an infinite magnetoresistance between the 0 and 1 states, a short access time, scalability to nm sizes and an extremely low write energy. Non-volatility and perfect reproducibility are inherent to such a device due to the quantized nature of the vortex.

  • 28. Golov, Nikolay
    et al.
    Rönnbäck, Lars
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap.
    Big Data normalization for massively parallel processing databases (2017). In: Computer Standards & Interfaces, ISSN 0920-5489, E-ISSN 1872-7018, Vol. 54, no. 2, pp. 86-93. Article in journal (Peer-reviewed)
    Abstract [en]

    High performance querying and ad-hoc querying are commonly viewed as mutually exclusive goals in massively parallel processing databases. Furthermore, there is a contradiction between ease of extending the data model and ease of analysis. The modern 'Data Lake' approach promises extreme ease of adding new data to a data model; however, it is prone to eventually becoming a Data Swamp - an unstructured, ungoverned, and out-of-control Data Lake where, due to a lack of process, standards and governance, data is hard to find, hard to use and is consumed out of context. This paper introduces a novel technique, highly normalized Big Data using Anchor modeling, that provides a very efficient way to store information and utilize resources, thereby providing ad-hoc querying with high performance for the first time in massively parallel processing databases. This technique is almost as convenient for expanding the data model as a Data Lake, while it is internally protected from turning into a Data Swamp. A case study of how this approach is used for a Data Warehouse at Avito over a three-year period, with estimates for and results of real data experiments carried out in HP Vertica, an MPP RDBMS, is also presented. This paper is an extension of the theses from the 34th International Conference on Conceptual Modeling (ER 2015) (Golov and Rönnbäck 2015) [1]; it is complemented with numerical results about key operating areas of a highly normalized big data warehouse, collected over several (1-3) years of commercial operation. Also, the limitations imposed by using a single MPP database cluster are described, and a cluster fragmentation approach is proposed.

  • 29.
    Gurung, Ram B.
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap.
    Adapted Random Survival Forest for Histograms to Analyze NOx Sensor Failure in Heavy Trucks (2019). In: Machine Learning, Optimization, and Data Science: Proceedings / [ed] Giuseppe Nicosia, Prof. Panos Pardalos, Renato Umeton, Prof. Giovanni Giuffrida, Vincenzo Sciacca, Springer, 2019, pp. 83-94. Conference paper (Peer-reviewed)
    Abstract [en]

    In heavy duty truck operation, important components need to be examined regularly so that any unexpected breakdowns can be prevented. Data-driven failure prediction models can be built using operational data from a large fleet of trucks. Machine learning methods such as Random Survival Forest (RSF) can be used to generate a survival model that can predict the survival probabilities of a particular component over time. Operational data from the trucks usually have many feature variables represented as histograms. Although the bins of a histogram can be treated as independent numeric variables, dependencies among the bins might exist that could be useful but are neglected when bins are treated individually. Therefore, in this article, we propose an extension to the standard RSF algorithm that can handle histogram variables and use it to train survival models for a NOx sensor. The trained model is compared in terms of overall error rate with the standard RSF model where bins of a histogram are treated individually as numeric features. The experimental results show that the adapted approach outperforms the standard approach, and the feature variables considered important are ranked.

  • 30.
    Gurung, Ram B.
    et al.
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap.
    Lindgren, Tony
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap.
    Boström, Henrik
    An Interactive Visual Tool to Enhance Understanding of Random Forest Prediction (2020). In: Archives of Data Science, Series A, E-ISSN 2363-9881. Article in journal (Peer-reviewed)
    Abstract [en]

    Random forests are known to provide accurate predictions, but the predictions are not easy to understand. In order to support understanding of such predictions, an interactive visual tool has been developed. The tool can be used to manipulate selected features to explore what-if scenarios. It exploits the internal structure of the decision trees in a trained forest model and presents this information as interactive plots and charts. In addition, the tool presents a simple decision rule as an explanation for the prediction. It also presents recommendations for reassignments of feature values of the example that lead to a change in the prediction to a preferred class. An evaluation of the tool was undertaken in a large truck manufacturing company, targeting fault prediction for a selected component in trucks. A set of domain experts were invited to use the tool and provide feedback in post-task interviews. The results of this investigation suggest that the tool may indeed aid in understanding the predictions of a random forest, and also allows for gaining new insights.

  • 31.
    Gurung, Ram Bahadur
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap.
    Learning Decision Trees and Random Forests from Histogram Data: An application to component failure prediction for heavy duty trucks (2017). Licentiate thesis, comprising papers (Other academic)
    Abstract [en]

    A large volume of data has become commonplace in many domains these days. Machine learning algorithms can be trained to look for useful hidden patterns in such data. Sometimes these big data need to be summarized to a manageable size, for example by using histograms, for various reasons. Traditionally, machine learning algorithms can be trained on data expressed as real numbers and/or categories, but not on a complex structure such as a histogram. Since machine learning algorithms that can learn from histogram data have not been explored to a major extent, this thesis intends to further explore this domain.

    This thesis has been limited to classification algorithms, in particular tree-based classifiers such as decision trees and random forests. Decision trees are one of the simplest and most intuitive algorithms to train. A single decision tree might not be the best algorithm in terms of its predictive performance, but it can be largely enhanced by considering an ensemble of many diverse trees as a random forest. This is the reason why both algorithms were considered. So, the objective of this thesis is to investigate how one can adapt these algorithms to make them learn better on histogram data. Our proposed approach considers the use of multiple bins of a histogram simultaneously to split a node during the tree induction process. Treating bins simultaneously is expected to capture dependencies among them, which could be useful. Experimental evaluation of the proposed approaches was carried out by comparing them with the standard approach of growing a tree where a single bin is used to split a node. Accuracy and the area under the receiver operating characteristic (ROC) curve (AUC), along with the average time taken to train a model, were used for comparison. For experimental purposes, real-world data from a large fleet of heavy duty trucks were used to build a component-failure prediction model. These data contain information about the operation of trucks over the years, where most operational features are summarized as histograms. Experiments were further performed on a synthetically generated dataset. From the results of the experiments, it was observed that the proposed approach outperforms the standard approach in predictive performance and compactness of the model, but lags behind in terms of training time. This thesis was motivated by a real-life problem encountered in the operation of heavy duty trucks in the automotive industry while building a data-driven failure-prediction model. Therefore, all the details about collecting and cleansing the data and the challenges encountered while making the data ready for training the algorithm are presented in detail.
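    The baseline described above, where each histogram bin becomes an independent numeric feature, can be reproduced with an off-the-shelf random forest; the thesis's adapted tree induction, which splits on several bins of one histogram jointly, is not available in standard libraries and is not shown. The Python sketch below uses synthetic data standing in for the fleet data (the dimensions and the failure rule are made up).

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        # Synthetic stand-in data: 1000 vehicles, one 10-bin operational histogram
        # each, plus a binary component-failure label.
        rng = np.random.default_rng(42)
        hist = rng.random((1000, 10))
        hist /= hist.sum(axis=1, keepdims=True)           # normalize each histogram
        y = (hist[:, 7:].sum(axis=1) > 0.25).astype(int)  # made-up failure rule

        # Bin-as-feature baseline: flatten the histogram so every bin is treated
        # as an independent numeric feature of the example.
        X_train, X_test, y_train, y_test = train_test_split(hist, y, random_state=0)
        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        clf.fit(X_train, y_train)
        print("bin-as-feature baseline accuracy:", clf.score(X_test, y_test))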

  • 32.
    Hidvegi, Attila
    et al.
    Stockholms universitet, Naturvetenskapliga fakulteten, Fysikum.
    Eriksson, Daniel
    Stockholms universitet, Naturvetenskapliga fakulteten, Fysikum.
    Bohm, Christian
    Stockholms universitet, Naturvetenskapliga fakulteten, Fysikum.
    A Small Portable Test System for the TileCal Digitizer System (2008). Conference paper (Peer-reviewed)
    Abstract [en]

    The hadronic Tile Calorimeter (TileCal) of the ATLAS detector at the LHC has a digitization, pipeline and readout system composed of nearly 2000 boards [1][2], developed and maintained by Stockholm University. Until now, a rather complex test system has been used to verify the functionality of the boards. However, this system was developed nearly 10 years ago and is now difficult to maintain due to several already obsolete components. A new, simpler, more reliable, and portable test system was therefore initiated. Its components have been chosen to reduce problems with obsolescence, and to allow easy migration to new platforms over the lifetime of the digitizer system.

  • 33.
    Hidvegi, Attila
    et al.
    Stockholms universitet, Naturvetenskapliga fakulteten, Fysikum.
    Eriksson, Daniel
    Stockholms universitet, Naturvetenskapliga fakulteten, Fysikum.
    Cederwall, Bo
    Stockholms universitet, Naturvetenskapliga fakulteten, Fysikum.
    Silverstein, Samuel
    Stockholms universitet, Naturvetenskapliga fakulteten, Fysikum.
    Bohm, Christian
    Stockholms universitet, Naturvetenskapliga fakulteten, Fysikum.
    A High-Speed Data Acquisition System for Segmented Ge-Detectors (2007). In: Nuclear Science Symposium Conference Record. IEEE, 2007, Vol. 1, pp. 536-537. Conference paper (Peer-reviewed)
    Abstract [en]

    When using segmented Ge-detectors for gamma ray tracking it is necessary to determine the segment pulse shapes with high accuracy. A high-speed data acquisition system with many channels, high precision and with high sampling rate is required. There are also many other applications for such a system. Our system uses high performance FPGAs (Xilinx Virtex-V [2]) to cope with the data rates delivered by the high speed ADC chosen (Atmel 2Gsps, 10 bits) and to make all the data processing onboard in real time. Each board contains four such ADCs, which can either handle four channels up to full speed, or achieve higher sampling rates with interleaving. The boards can communicate with each other over different types of high-speed communication links. Control and monitoring is implemented with embedded processors. The processed result will be transmitted over Ethernet to final storage. The project introduces many challenging issues: signal integrity, ADC performance, interfacing ADCs to the FPGA, synchronisation of ADCs across the entire system, implementing flexible processing algorithms, high speed interconnection between boards and managing the significant heat generation. This is an ongoing project with interesting potentials for the future.

  • 34.
    Hidvegi, Attila
    et al.
    Stockholms universitet, Naturvetenskapliga fakulteten, Fysikum.
    Eriksson, Daniel
    Stockholms universitet, Naturvetenskapliga fakulteten, Fysikum.
    Cederwall, Bo
    Stockholms universitet, Naturvetenskapliga fakulteten, Fysikum.
    Silverstein, Samuel
    Stockholms universitet, Naturvetenskapliga fakulteten, Fysikum.
    Bohm, Christian
    Stockholms universitet, Naturvetenskapliga fakulteten, Fysikum.
    A High-Speed Data Acquisition System for Segmented Ge-Detectors (2006). In: Nuclear Science Symposium Conference Record. IEEE, 2006, Vol. 1, pp. 999-1001. Conference paper (Peer-reviewed)
    Abstract [en]

    When using segmented Ge-detectors for gamma ray tracking it is necessary to determine the segment pulse shapes with high accuracy. A high-speed data acquisition system with many channels, high precision and with high sampling rate is required. To find the optimum performance, we are investigating what can be achieved by a system with extremely high sampling rates, 10 bits @2 GS/s. There are many other applications for such a system. Higher sampling rates usually mean lower bit resolution of the ADC, but with oversampling we expect to achieve a very good energy and time resolution. The system uses high performance FPGAs (Xilinx Virtex-IV) to cope with the data rates delivered by the high speed ADCs and to make all the data processing onboard in real time. Control and monitoring is implemented in an embedded soft processor. This processor is also in charge of the offboard gigabit Ethernet communication. The final system will consist of several separate boards, each with a number of input channels that will have to communicate with each other in real time over a high-speed communication link. The processed result will be transmitted over Ethernet to final storage. The project introduces many challenging issues, which are being addressed in turn with different prototype designs. These issues are: the ADC performance, interfacing the ADCs to the FPGA, implementing the flexible processing algorithms and high speed interconnection between the boards.

  • 35.
    Hidvegi, Attila
    et al.
    Stockholms universitet, Naturvetenskapliga fakulteten, Fysikum.
    Gessler, Patrick
    Deutsches Elektronen-Synchrotron (DESY).
    Rehlich, Kay
    Deutsches Elektronen-Synchrotron (DESY).
    Bohm, Christian
    Stockholms universitet, Naturvetenskapliga fakulteten, Fysikum.
    An Advanced FPGA Based Phase-Lock-Loop System as an Alternative Solution for the XFEL Timing System2009Inngår i: Nuclear Science Symposium Conference Record (NSS/MIC), 2009 IEEE, 2009, s. 1871-1872Konferansepaper (Fagfellevurdert)
    Abstract [en]

    The European XFEL project requires high-speed, very precise clock and timing distribution over large distances. A prototype system that fulfils the current requirements and uses high-end components has just been completed and is being tested. However, the system is quite complicated and the boards are very complex, being designed in the small Micro-TCA form factor. A way to simplify the system, and perhaps reduce cost, would be to implement an advanced PLL in the programmable logic of an FPGA, which would then control an external VCO. Doing so would resolve several major issues at the same time, while making more use of the advanced features of modern FPGAs. Such a system could be an alternative solution to the complex part of the Timing and Triggering System for XFEL.
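    A minimal software sketch of the idea (all gains, frequencies and the update rate are placeholder assumptions, not the XFEL design): a PI loop filter of the kind one might place in FPGA logic steers an external VCO so that its phase tracks a reference clock.

        # Sketch only: digital PLL with a PI loop filter steering an external VCO.
        # All parameter values are illustrative assumptions.
        def simulate_pll(f_ref=10e6, f_vco_free=9.9e6, kv=1e6, kp=0.2, ki=0.02,
                         dt=1e-8, steps=20000):
            """Return the residual frequency error (Hz) after the loop settles."""
            ref_phase = vco_phase = 0.0        # phases in cycles
            integrator = control = 0.0
            for _ in range(steps):
                ref_phase += f_ref * dt
                vco_phase += (f_vco_free + kv * control) * dt
                err = (ref_phase - vco_phase + 0.5) % 1.0 - 0.5   # wrapped phase error
                integrator += ki * err
                control = kp * err + integrator    # PI filter -> DAC -> VCO tune input
            return f_vco_free + kv * control - f_ref

        if __name__ == "__main__":
            print(f"residual frequency error: {simulate_pll():.3f} Hz")

    In hardware the same structure maps onto a phase detector, a PI filter in the FPGA fabric and a DAC driving the VCO tuning input.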

  • 36.
    Hidvegi, Attila
    et al.
    Stockholms universitet, Naturvetenskapliga fakulteten, Fysikum.
    Gessler, Patrick
    Deutsches Elektronen-Synchrotron (DESY).
    Rehlich, Kay
    Deutsches Elektronen-Synchrotron (DESY).
    Bohm, Christian
    Stockholms universitet, Naturvetenskapliga fakulteten, Fysikum.
    Timing and Triggering System Prototype for the XFEL Project2010Inngår i: IEEE Transactions on Nuclear Science, ISSN 0018-9499, E-ISSN 1558-1578, Vol. 58, nr 4, s. 1852-1856Artikkel i tidsskrift (Fagfellevurdert)
    Abstract [en]

    The European X-ray Free Electron Laser (XFEL) [1] at DESY in Hamburg will begin operating in the next few years, enabling new, ground-breaking research opportunities. The entire system requires very precise clock and trigger distribution, synchronous with the 1.3 GHz system RF frequency, over distances of more than 3.4 km. The new experiment demanded features that commercial solutions could not yet provide. Researchers at Stockholm University and DESY have developed a prototype for the timing system of XFEL. It has been decided that XFEL will use modern ATCA and Micro-TCA systems because of their advanced features and reliability. The timing system has been adapted to the Micro-TCA bus standard and also follows the upcoming xTCA-for-physics standard. The prototype is fully functional and complete, and will serve as a platform for future development of the whole timing system. This paper describes the hardware design and some test results obtained with the prototype board.

  • 37.
    Hidvegi, Attila
    et al.
    Stockholms universitet, Naturvetenskapliga fakulteten, Fysikum.
    Gessler, Patrick
    Deutsches Elektronen-Synchrotron (DESY).
    Rehlich, Kay
    Deutsches Elektronen-Synchrotron (DESY).
    Bohm, Christian
    Stockholms universitet, Naturvetenskapliga fakulteten, Fysikum.
    Rydström, Stefan
    Stockholms universitet, Naturvetenskapliga fakulteten, Fysikum.
    A System for Distributing High-Speed Synchronous High-Precision Clock and Trigger Data over Large Distances2008Inngår i: Nuclear Science Symposium Conference Record, 2008. NSS '08. IEEE, 2008, s. 2581-2584Konferansepaper (Fagfellevurdert)
    Abstract [en]

    The distribution of precise timing throughout the European X-ray Free Electron Laser project [1] (XFEL) and its triggering system is a very challenging part of the system design. ADCs in data acquisition systems and DACs in control systems will require very high precision clocks. The clocks need to be synchronous with each other, both in frequency and phase, with a jitter performance better than 5 ps (RMS). For some high-speed ADCs, a precision down to 0.1 ps may even be needed. The frequencies that must be available are the main 1.3 GHz frequency and a number of lower frequencies, all derived from the main frequency. The phase needs to be adjustable to allow synchronization between separate devices.
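    To put the jitter budget in perspective (a simple conversion, not stated in the abstract): at the 1.3 GHz reference frequency an RMS timing jitter corresponds to an RMS phase error

        \[ \sigma_{\varphi} = 2\pi f \, \sigma_{t}, \qquad 2\pi \times 1.3\ \mathrm{GHz} \times 5\ \mathrm{ps} \approx 0.041\ \mathrm{rad} \approx 2.3^{\circ}, \]

    while the tighter 0.1 ps figure corresponds to roughly 0.05 degrees of phase at 1.3 GHz.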

  • 38.
    Homem, Irvin
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap.
    Advancing Automation in Digital Forensic Investigations2018Doktoravhandling, med artikler (Annet vitenskapelig)
    Abstract [en]

    Digital Forensics is used to aid traditional preventive security mechanisms when they fail to curtail sophisticated and stealthy cybercrime events. The digital forensic investigation process is largely manual in nature, or at best quasi-automated, requiring a highly skilled labour force and involving a sizeable time investment. Industry-standard tools are evidence-centric, automate only a few precursory tasks (e.g., parsing and indexing) and have limited capabilities for integrating multiple evidence sources. Furthermore, these tools are always human-driven.

    These challenges are exacerbated in the increasingly computerized and highly networked environment of today. Volumes of digital evidence to be collected and analyzed have increased, and so has the diversity of digital evidence sources involved in a typical case. This further handicaps digital forensics practitioners, labs and law enforcement agencies, causing delays in investigations and legal systems due to backlogs of cases. Improved efficiency of the digital investigation process is needed, in terms of increasing the speed and reducing the human effort expended. This study aims at achieving this time and effort reduction, by advancing automation within the digital forensic investigation process.

    Using a Design Science research approach, artifacts are designed and developed to address these practical problems. In summary, the requirements and architecture of a system for automating digital investigations in highly networked environments are designed. The architecture initially focuses on automation of the identification and acquisition of digital evidence, while later versions focus on full automation and self-organization of devices for all phases of the digital investigation process. Part of the remote evidence acquisition capability of this system architecture is implemented as a proof of concept. The speed and reliability of capturing digital evidence from remote mobile devices over a client-server paradigm are evaluated. A method for the uniform representation and integration of multiple diverse evidence sources, enabling automated correlation, simple reasoning and querying, is developed and tested. This method is aimed at automating the analysis phase of digital investigations. Machine Learning (ML)-based triage methods are developed and tested to evaluate the feasibility and performance of using such techniques to automate the identification of priority digital evidence fragments. Models from these ML methods are evaluated on identifying network protocols within DNS-tunneled network traffic. A large dataset is also created for future research in ML-based triage for identifying suspicious processes in memory forensics.

    From an ex ante evaluation, the designed system architecture enables individual devices to participate in the entire digital investigation process, contributing their processing power towards alleviating the burden on the human analyst. Experiments show that remote evidence acquisition from mobile devices over networks is feasible; however, a single-TCP-connection paradigm scales poorly. A proof-of-concept experiment demonstrates the viability of automated integration, correlation and reasoning over multiple diverse evidence sources using semantic web technologies. Experimentation also shows that ML-based triage methods can enable prioritization of certain digital evidence sources, for acquisition or analysis, with up to 95% accuracy.

    The artifacts developed in this study provide concrete ways to enhance automation in the digital forensic investigation process to increase the investigation speed and reduce the amount of costly human intervention needed.
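    A schematic illustration of the kind of ML-based triage described above (the features, labels and model choice are assumptions for demonstration, not the thesis's actual pipeline):

        # Sketch only: binary triage of evidence fragments with a random-forest
        # classifier. Features, labels and the model are illustrative assumptions.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import accuracy_score

        rng = np.random.default_rng(0)
        # Toy feature vectors, e.g. DNS payload length and byte entropy;
        # label 1 = "tunneled / suspicious", 0 = "benign".
        n = 2000
        benign = rng.normal(loc=[60, 3.5], scale=[10, 0.4], size=(n, 2))
        tunnel = rng.normal(loc=[180, 6.5], scale=[40, 0.6], size=(n, 2))
        X = np.vstack([benign, tunnel])
        y = np.concatenate([np.zeros(n), np.ones(n)])

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
        print(f"triage accuracy on held-out data: {accuracy_score(y_te, clf.predict(X_te)):.2f}")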

     

  • 39.
    Högås, Marcus
    et al.
    Stockholms universitet, Naturvetenskapliga fakulteten, Fysikum. SP Technical Research Institute of Sweden, Sweden.
    Rydler, Karl-Erik
    Stenarson, Jörgen
    Yhland, Klas
    Analytic Solution of the Magnetic Field and Inductance in a Coaxial Short Circuit2015Inngår i: IEEE Transactions on Instrumentation and Measurement, ISSN 0018-9456, E-ISSN 1557-9662, Vol. 64, nr 6, s. 1582-1587Artikkel i tidsskrift (Fagfellevurdert)
    Abstract [en]

    In this paper, the analytic solution for the magnetic field and the inductance in a coaxial short circuit is derived from Maxwell's equations with the appropriate boundary conditions on the short circuit. The Helmholtz equation is thereby obtained for the magnetic field and is solved by a mode-matching technique. The inductance is obtained by integrating the absolute square of the magnetic field. The solution is discussed in the light of earlier approximations and solutions and is evaluated both theoretically and through measurements.
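    For reference, the vector Helmholtz equation solved by mode matching and the standard energy-based relation presumably underlying the inductance calculation (the notation here is generic, not necessarily the paper's):

        \[ \nabla^{2}\mathbf{H} + k^{2}\mathbf{H} = 0, \qquad \tfrac{1}{2} L |I|^{2} = \tfrac{\mu_{0}}{2} \int_{V} |\mathbf{H}|^{2} \, \mathrm{d}V \;\Longrightarrow\; L = \frac{\mu_{0}}{|I|^{2}} \int_{V} |\mathbf{H}|^{2} \, \mathrm{d}V . \]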

  • 40.
    Hörnstein, Jonas
    et al.
    Institute for System and Robotics (ISR), Instituto Superior Técnico, Lisbon, Portugal.
    Gustavsson, Lisa
    Stockholms universitet, Humanistiska fakulteten, Institutionen för lingvistik, Avdelningen för fonetik.
    Santos-Victor, José
    Institute for System and Robotics (ISR), Instituto Superior Técnico, Lisbon, Portugal.
    Lacerda, Francisco
    Stockholms universitet, Humanistiska fakulteten, Institutionen för lingvistik, Avdelningen för fonetik.
    Multimodal language acquisition based on motor learning and interaction2010Inngår i: From Motor Learning to Interaction Learning in Robots / [ed] Olivier Sigaud & Jan Peters, Springer Berlin/Heidelberg, 2010, s. 467-489Kapittel i bok, del av antologi (Annet vitenskapelig)
    Abstract [en]

    In this work we propose a methodology for language acquisition in humanoid robots that mimics the process in children. Language acquisition is a complex process that involves mastering several different tasks, such as producing speech sounds, learning how to group different sounds into a consistent and manageable number of classes or speech units, grounding speech, and recognizing the speech sounds when uttered by other persons. While it is not known to what extent these abilities are learned or written in our genetic code, this work aims at two intertwined goals: (i) to investigate how much linguistic structure can be derived directly from the speech signal directed to infants, by (ii) designing, building and testing biologically plausible models for language acquisition in a humanoid robot. We have therefore chosen to avoid implementing any pre-programmed linguistic knowledge, such as phonemes, into these models. Instead we rely on general methods such as pattern matching and hierarchical clustering techniques, and show that it is possible to acquire important linguistic structures directly from the speech signal through interaction with a caregiver. We also show that this process can be facilitated through the use of motor learning.
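    A minimal sketch of the kind of hierarchical clustering mentioned above, applied to toy speech-sound feature vectors (the features and the number of classes are placeholders, not the authors' implementation):

        # Sketch only: agglomerative clustering of toy "speech sound" feature
        # vectors (formant-like values in Hz) into a small number of classes.
        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        rng = np.random.default_rng(1)
        sounds = np.vstack([
            rng.normal([300, 2300], 50, size=(20, 2)),   # /i/-like tokens
            rng.normal([700, 1200], 50, size=(20, 2)),   # /a/-like tokens
            rng.normal([350, 800], 50, size=(20, 2)),    # /u/-like tokens
        ])
        tree = linkage(sounds, method="ward")
        labels = fcluster(tree, t=3, criterion="maxclust")
        print("number of discovered classes:", len(set(labels)))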

  • 41.
    Höök, Kristina
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap.
    A glass box approach to adaptive hypermedia1996Doktoravhandling, monografi (Annet vitenskapelig)
  • 42. Ivanenko, Y.
    et al.
    Nedic, Mitja
    Stockholms universitet, Naturvetenskapliga fakulteten, Matematiska institutionen.
    Gustafsson, M.
    Jonsson, B. L. G.
    Luger, Annemarie
    Stockholms universitet, Naturvetenskapliga fakulteten, Matematiska institutionen.
    Nordebo, S.
    Quasi-Herglotz functions and convex optimization2020Inngår i: Royal Society Open Science, E-ISSN 2054-5703, Vol. 7, nr 1, artikkel-id 191541Artikkel i tidsskrift (Fagfellevurdert)
    Abstract [en]

    We introduce the set of quasi-Herglotz functions and demonstrate that it has properties useful in the modelling of non-passive systems. The linear space of quasi-Herglotz functions constitutes a natural extension of the convex cone of Herglotz functions. It consists of differences of Herglotz functions and we show that several of the important properties and modelling perspectives are inherited by the new set of quasi-Herglotz functions. In particular, this applies to their integral representations, the associated integral identities or sum rules (with adequate additional assumptions), their boundary values on the real axis and the associated approximation theory. Numerical examples are included to demonstrate the modelling of a non-passive gain medium formulated as a convex optimization problem, where the generating measure is modelled by using a finite expansion of B-splines and point masses.
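    For reference (the classical representation that the integral representations above extend; the notation is generic and not necessarily the paper's): a Herglotz function h, holomorphic in the open upper half-plane with non-negative imaginary part, can be written

        \[ h(z) = \beta z + \alpha + \int_{\mathbb{R}} \Bigl( \frac{1}{\xi - z} - \frac{\xi}{1 + \xi^{2}} \Bigr) \, \mathrm{d}\mu(\xi), \qquad \operatorname{Im} z > 0, \]

    with \( \alpha \in \mathbb{R} \), \( \beta \geq 0 \) and \( \mu \) a positive Borel measure satisfying \( \int_{\mathbb{R}} \mathrm{d}\mu(\xi)/(1+\xi^{2}) < \infty \); for a quasi-Herglotz function, i.e. a difference of two Herglotz functions, the same form holds with \( \beta \) real and \( \mu \) a real-valued (signed) measure.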

  • 43. Ivanenko, Yevhen
    et al.
    Nedic, Mitja
    Stockholms universitet, Naturvetenskapliga fakulteten, Matematiska institutionen.
    Gustafsson, Mats
    Jonsson, B. L. G.
    Luger, Annemarie
    Stockholms universitet, Naturvetenskapliga fakulteten, Matematiska institutionen.
    Nordebo, Sven
    Quasi-Herglotz functions and convex optimization2018Rapport (Annet vitenskapelig)
    Abstract [en]

    We introduce the set of quasi-Herglotz functions and demonstrate that it has properties useful in the modeling of non-passive systems. The linear space of quasi-Herglotz functions constitutes a natural extension of the convex cone of Herglotz functions. It consists of differences of Herglotz functions, and we show that several of the important properties and modeling perspectives of Herglotz functions are inherited by the new set of quasi-Herglotz functions. In particular, this applies to their integral representations, the associated integral identities or sum rules (with adequate additional assumptions), their boundary values on the real axis and the associated approximation theory. Numerical examples are included to demonstrate the modeling of a non-passive gain medium formulated as a convex optimization problem, where the generating measure is modeled by using a finite expansion of B-splines and point masses.

  • 44. Ivanov, S. A.
    et al.
    Bush, A. A.
    Hudl, Matthias
    Stockholms universitet, Naturvetenskapliga fakulteten, Fysikum.
    Stash, A. I.
    Andre, G.
    Tellgren, R.
    Cherepanov, V. M.
    Stepanov, A. V.
    Kamentsev, K. E.
    Tokunaga, Y.
    Taguchi, Y.
    Tokura, Y.
    Nordblad, P.
    Mathieu, R.
    Spin and dipole order in geometrically frustrated mixed-valence manganite Pb3Mn7O152016Inngår i: Journal of materials science. Materials in electronics, ISSN 0957-4522, E-ISSN 1573-482X, Vol. 27, nr 12, s. 12562-12573Artikkel i tidsskrift (Fagfellevurdert)
    Abstract [en]

    The structural, magnetic, and dielectric properties of Pb3Mn7O15 have been investigated using high-quality single crystals. Pb3Mn7O15 adopts a pseudo-hexagonal orthorhombic structure, with partially filled Kagomé layers connected by ribbons of edge-sharing MnO6 octahedra and intercalated Pb cations. There are nine inequivalent sites in the structure for the Mn ions, which exist both as Mn3+ and Mn4+. Pb3Mn7O15 undergoes an antiferromagnetic transition below T_N ~ 67 K, with significant geometric frustration. Neutron powder diffraction on crushed single crystals allowed us to determine the low-temperature antiferromagnetic structure. We discuss the magnetic interaction pathways in the structure and the possible interplay between the structural distortions imprinted by the lone electron pair of the Pb2+ cations and Mn3+/Mn4+ charge ordering.

  • 45.
    Jalali, Amin
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap.
    Aspect-Oriented Business Process Management2016Doktoravhandling, med artikler (Annet vitenskapelig)
    Abstract [en]

    Separation of concerns has long been considered an effective and efficient strategy to deal with complexity in information systems. One sort of concern, like security and privacy, crosses over other concerns in a system. Such concerns are called cross-cutting concerns. As a result, the realization of these concerns is scattered through the whole system, which makes their management difficult.

    Aspect Orientation is a paradigm in information systems which aims to modularize cross-cutting concerns. This paradigm is well researched in the programming area, where many aspect-oriented programming languages have been developed, e.g., AspectJ. It has also been investigated in other areas, such as requirement engineering and service composition. In the Business Process Management (BPM) area, Aspect Oriented Business Process Modeling aims to specify how this modularization technique can support encapsulating cross-cutting concerns in process models. However, it is not clear how these models should be supported in the whole BPM lifecycle. In addition, the support for designing these models has only been limited to imperative process models that support rigid business processes. Neither has it been investigated how this modularization technique can be supported through declarative or hybrid models to support the separation of cross-cutting concerns for flexible business processes.

    Therefore, this thesis investigates how aspect orientation can be supported over the whole BPM lifecycle using imperative aspect-oriented business process models. It also investigates how declarative and hybrid aspect-oriented business process models can support the separation of cross-cutting concerns in the BPM area. This thesis has been carried out following the design science framework, and the result is presented as a set of artifacts (in the form of constructs, models, methods, and instantiations) and empirical findings.

    The artifacts support modeling, analysis, implementation/configuration, enactment, monitoring, adjustment, and mining cross-cutting concerns while supporting business processes using Business Process Management Systems. Thus, it covers the support for the management of these concerns over the whole BPM lifecycle. The use of these artifacts and their application shows that they can reduce the complexity of process models by separating different concerns.
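    As a programming-level analogy only (a Python decorator rather than the BPMN-based artifacts of the thesis): a cross-cutting concern such as audit logging can be modularized in one place and woven onto several core functions instead of being scattered through each of them.

        # Analogy only: a cross-cutting concern (audit logging) modularized as a
        # decorator, keeping the core business logic free of logging code.
        import functools
        import logging

        logging.basicConfig(level=logging.INFO)

        def audited(func):
            """The 'aspect': advice woven around every decorated function."""
            @functools.wraps(func)
            def wrapper(*args, **kwargs):
                logging.info("entering %s args=%s", func.__name__, args)
                result = func(*args, **kwargs)
                logging.info("leaving %s result=%s", func.__name__, result)
                return result
            return wrapper

        @audited
        def approve_loan(amount):          # core concern, free of logging code
            return amount < 10_000

        @audited
        def close_account(account_id):     # another core concern, same aspect reused
            return f"{account_id} closed"

        if __name__ == "__main__":
            approve_loan(5_000)
            close_account("ACC-42")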

  • 46.
    Jalali, Amin
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap.
    Foundation of Aspect Oriented Business Process Management2012Independent thesis Advanced level (degree of Master (Two Years)), 20 poäng / 30 hpOppgave
    Abstract [en]

    Reducing complexity in information systems is a main concern that researchers work on. Separation of concerns, also known as the principle of ‘divide and conquer’, has long been a strategy for dealing with complexity. Two examples of the application of this principle in the area of information system design are the breaking out of data management into Database Management Systems (DBMSs) and the separation of the business logic from the application logic into Business Process Management Systems (BPMSs). However, separation of cross-cutting concerns from the core concern of a business process is not yet supported in the Business Process Management (BPM) area. The Aspect Oriented principle recommends such a separation. When looking into a business process, several concerns, such as security and privacy, can be identified. Therefore, a formal model that provides a foundation for enabling BPMSs to support separation of concerns in the BPM area is needed. This thesis provides a formal model for dealing with separation of concerns in the BPM area. Implementing this model in BPMSs would facilitate the design and implementation of business processes with a lower level of complexity, which in turn would reduce the costs associated with BPM projects. The thesis starts with a literature review on aspect orientation both in programming and in the BPM area. Based on this study, a list of requirements for an Aspect Oriented Service for BPMSs is compiled. Then a formal model for such a service, fulfilling a set of these requirements, is designed using Coloured Petri Nets and implemented in CPN Tools. The model is evaluated through the execution of a number of scenarios. The solution is also validated through an industrial case study. The results of the case study are presented and the direction for future work is outlined. The case study demonstrates that separation of concerns through aspect orientation does indeed reduce the complexity of business process models.

  • 47.
    Jalali, Amin
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap.
    Service Oriented Modularization using Coloured Petri Nets2012Konferansepaper (Annet (populærvitenskap, debatt, mm))
    Abstract [en]

    Modelling service-oriented systems using Coloured Petri Nets usually results in cluttered nets which are hard to understand and modify. This complexity is a result of the many interactions among services. This paper presents a method for designing service-oriented models using Coloured Petri Nets. The method yields less complex nets that can be extended more easily. The method is validated by demonstrating its impact on defining the operational semantics of a service.

  • 48.
    Jalali, Amin
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap.
    Supporting Enactment of Aspect Oriented Business Process Models: an approach to separate cross-cutting concerns in action2013Licentiatavhandling, med artikler (Annet vitenskapelig)
    Abstract [en]

    Coping with complexity in Information Systems and Software Engineering is an important issue in both research and industry. One strategy to deal with this complexity is through a separation of concerns, which can result in reducing the complexity, improving the re-usability, and simplifying the evolution. Separation of concerns can be addressed through the Aspect Oriented paradigm. Although this paradigm has been well researched in the field of programming, it is still at a preliminary stage in the area of Business Process Management. While some efforts have been made to propose aspect orientation for business process modeling, it has not yet been investigated how these models should be implemented, configured, run, and adjusted. Such a gap has restrained the enactment of aspect oriented business process models in practice. Therefore, this research enables the enactment of such models to support the separation of cross-cutting concerns in the entire business process management life-cycle. It starts by defining the operational semantics for the Aspect Oriented extension of the Business Process Model and Notation. The semantics specifies how such models can be implemented and configured, and can be used as a blueprint to support the enactment of aspect oriented business process models. The semantics is implemented in the form of artifacts, which are then used in a banking case study to investigate the current modeling technique. This investigation revealed new requirements, which should be considered in aspect oriented modeling approaches. Thus, the current modeling notation has been extended to include the new requirements. The extended notation has been formalized and investigated through re-modeling the processes in the case study. The results from this investigation show the need to refine the separation rules to support the encapsulation of aspects based on different business process perspectives. Therefore, a new refinement is proposed, formalized, and implemented. The implementation is then used as a prototype to evaluate the result through a case study.

  • 49. Jia, Tiekun
    et al.
    Wang, Xiaofeng
    Wang, Weimin
    Wang, Yujiang
    Liao, Guihua
    Xiong, Yan
    Stockholms universitet, Naturvetenskapliga fakulteten, Institutionen för material- och miljökemi (MMK), Avdelningen för oorganisk kemi och strukturkemi.
    Facile Synthesis of Porous SnO(2) Spherical-Like Aggregates and Their Gas Sensing Property2011Inngår i: Integrated Ferroelectrics, ISSN 1058-4587, E-ISSN 1607-8489, Vol. 128, s. 30-36Artikkel i tidsskrift (Fagfellevurdert)
    Abstract [en]

    Synthesis of porous spherical-like aggregates of SnO(2) crystals was achieved via a hydrothermal process in a mixed solvent of water and ethanol. The products were characterized by powder X-ray diffraction, scanning electron microscopy, transmission electron microscopy and Raman measurements. Morphological characterization showed that the porous spherical-like aggregates were composed of short nanorods. The broadening of the three main peaks and their shift to higher wavenumbers in the Raman spectrum implied that the components of the products were of smaller size. The gas sensing measurements indicated that the porous spherical-like aggregates had excellent ethanol gas sensing properties compared to those of nanoparticles.

  • 50.
    Johansson, Anna-Lena
    Stockholms universitet, Samhällsvetenskapliga fakulteten, Institutionen för data- och systemvetenskap.
    Logic program synthesis using schema instantiation in an interactive environment1995Doktoravhandling, monografi (Annet vitenskapelig)
    Abstract [en]

    The research presented herein proposes a method of program synthesis based on a recursive program schema and performed with an explicit incremental plan as the core of the synthesis. A partial prototype has been built in order to be able to actually perform syntheses according to the method. The presentation of the method is accompanied by examples of performed syntheses.

    The program schemata proposed are simple and based directly on the inductive definition of a data structure which is a basis for the program. The replacement rule for instantiating the schemata is also simple. The simple schema and the simple rule should make the method easy to understand.
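    As an analogy in a general-purpose language (Python rather than the thesis's logic-programming setting; the helper name is hypothetical): a single recursion schema over the inductive structure of lists can be instantiated with different parameters to yield different programs.

        # Analogy only: one structural-recursion schema over lists, instantiated
        # twice to obtain two different programs.
        def list_schema(base, step):
            """Schema: recursion over the inductive cases [] and x:xs."""
            def program(xs):
                if not xs:                               # base case
                    return base
                return step(xs[0], program(xs[1:]))      # recursive case
            return program

        length = list_schema(0, lambda _x, rest: 1 + rest)   # one instantiation
        total = list_schema(0, lambda x, rest: x + rest)     # another instantiation

        if __name__ == "__main__":
            print(length([3, 1, 4]), total([3, 1, 4]))       # -> 3 8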

    In situations where program sentences in a program are similar, meaning that there are similarities in their derivations, we would like, if feasible, to avoid constructing all the corresponding derivations. A method to decide when a definition yields analogous sentences, and which also produces a substitution defining the analogy, is presented. As a result we can replace a derivation by a substitution, lightening the burden of synthesis. The method has been implemented as a part of the system for interactive synthesis support.

    The synthesised programs are discussed with three logical concerns in mind as follows: partial correctness, completeness and totality. The synthesised normal programs are always logical consequences of the specification. Whenever the programs and their goals are definite the programs are always partially correct. From a study of the synthesis emerges a sufficient condition for programs that use negation to be partially correct and for definite or normal programs to be complete. Sufficient conditions for the derived relation to be total can be used to show that the program is defined for every element of the recursive set.
