From MFKP_wiki


Selection: with tag software-engineering [73 articles] 


Beyond \newcommand with xparse

TUGboat, Vol. 31, No. 1. (2010), pp. 80-83


[Excerpt: Introduction] The LaTeX 2ε \newcommand macro is most LaTeX users’ first choice for creating macros. As well as the ‘sanity checks’ it carries out, the ability to define macros with an optional argument is very useful. However, to go beyond using a single optional argument, or to create more complex input syntaxes, LaTeX 2ε users have to do things ‘by hand’ using \def or load one of the packages which extend \newcommand (for example twoopt (Oberdiek, 2008)). [\n] As part of the ...
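As a minimal sketch of what the excerpt alludes to (the macro below is hypothetical, not taken from the article): xparse argument specifiers such as O{default} declare optional arguments with defaults, so two optionals before a mandatory m argument need no hand-written \def machinery:

```latex
\documentclass{article}
\usepackage{xparse}% built into the LaTeX kernel since October 2020
% Two optional arguments with defaults (O{...}) followed by one
% mandatory argument (m) -- a signature plain \newcommand cannot declare.
\NewDocumentCommand{\entrynote}{O{unknown venue} O{n.d.} m}{%
  #3\ (#1, #2)%
}
\begin{document}
\entrynote{A title}                % both defaults used
\entrynote[TUGboat][2010]{A title} % both optionals supplied
\end{document}
```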


Software engineering for computational science: past, present, future

Computing in Science & Engineering (2018), pp. 1-1,


While the importance of in silico experiments for the scientific discovery process increases, state-of-the-art software engineering practices are rarely adopted in computational science. To understand the underlying causes for this situation and to identify ways of improving it, we conduct a literature survey on software engineering practices in computational science. As a result of our survey, we identified 13 recurring key characteristics of scientific software development that can be divided into three groups: characteristics that result (1) from the ...


Legacy system anti-patterns and a pattern-oriented migration response

In Systems Engineering for Business Process Change (2000), pp. 239-250,
edited by Peter Henderson


Mature information systems grow old disgracefully as successive waves of hacking result in accidental architectures which resist the reflection of ongoing business process change. Such petrified systems are termed legacy systems. Legacy systems are simultaneously business assets and business liabilities. Their hard-won dependability and accurate reflection of tacit business knowledge prevent us from undertaking green-field development of replacement systems. Their resistance to the reflection of business process change prevents us from retaining them. Consequently, we are drawn in this paper to ...


Ten simple rules for making research software more robust

PLOS Computational Biology, Vol. 13, No. 4. (13 April 2017), e1005412,


[Abstract] Software produced for research, published and otherwise, suffers from a number of common problems that make it difficult or impossible to run outside the original institution or even off the primary developer’s computer. We present ten simple rules to make such software robust enough to be run by anyone, anywhere, and thereby delight your users and collaborators. [Author summary] Many researchers have found out the hard way that there’s a world of difference between “works for me on my machine” and “works for ...


A comparative evaluation of core kernel features of the recent Linux, FreeBSD, Solaris and Windows operating systems

In Proceedings of The World Congress on Engineering 2016 (2016)


The paper compares core kernel architecture and functionality of four modern operating systems. The subsystems examined are process/thread architecture, scheduling and interrupt handling. Linux, Solaris and FreeBSD have a lot of similarities, owing to their shared Unix roots, but also have some notable differences. Windows, however, is significantly different, being a radically non-Unix design. The paper compares some aspects of the Unix-like approaches of Linux/Solaris/FreeBSD with Windows, emphasizing the consequences of their different design decisions, and presents some comparative performance results, using Java benchmarks. [Excerpt: Conclusions] The paper aimed to provide an insight into ...


Multi-dimensional weighted median: the module "wmedian" of the Mastrave modelling library

In Semantic Array Programming with Mastrave - Introduction to Semantic Computational Modelling (2012)


Weighted median (WM) filtering is a well known technique for dealing with noisy images and a variety of WM-based algorithms have been proposed as effective ways for reducing uncertainties or reconstructing degraded signals by means of available information with heterogeneous reliability. Here a generalized module for applying weighted median filtering to multi-dimensional arrays of information with associated multi-dimensional arrays of corresponding weights is presented. Weights may be associated to single elements or to groups of elements along given dimensions of the ...
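The scalar operation underneath can be sketched as follows (illustrative only; per the abstract, the module generalizes this to multi-dimensional arrays of values with associated arrays of weights, and the function name here is hypothetical): the weighted median is the value at which the cumulative weight of the sorted sample first reaches half the total weight.

```python
def weighted_median(values, weights):
    """Return the value at which the cumulative weight of the
    sorted (value, weight) pairs first reaches half the total."""
    pairs = sorted(zip(values, weights))
    total = sum(w for _, w in pairs)
    acc = 0.0
    for v, w in pairs:
        acc += w
        if acc >= total / 2:
            return v

# With uniform weights this reduces to the (lower) median:
print(weighted_median([1, 2, 3, 4, 5], [1, 1, 1, 1, 1]))   # 3
# A heavily weighted element pulls the median toward itself:
print(weighted_median([1, 2, 3, 4, 5], [1, 1, 1, 1, 10]))  # 5
```

The per-element weighting is what lets WM-based filters honour information sources of heterogeneous reliability while remaining robust to outliers.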


Research software sustainability: report on a knowledge exchange workshop

(February 2016)


[Excerpt: Executive summary] Without software, modern research would not be possible. Understandably, people tend to marvel at results rather than the tools used in their discovery, which means the fundamental role of software in research has been largely overlooked. But whether it is widely recognised or not, research is inexorably connected to the software that is used to generate results, and if we continue to overlook software we put at risk the reliability and reproducibility of the research itself. [\n] The adoption of software is accompanied by new risks - many of ...


They write the right stuff

Fast Company, Vol. 6 (December 1996), 28121


[Excerpt] As the 120-ton space shuttle sits surrounded by almost 4 million pounds of rocket fuel, exhaling noxious fumes, visibly impatient to defy gravity, its on-board computers take command. Four identical machines, running identical software, pull information from thousands of sensors, make hundreds of millisecond decisions, vote on every decision, check with each other 250 times a second. A fifth computer, with different software, stands by to take control should the other four malfunction. [\n] At T-minus 6.6 seconds, if the pressures, pumps, and temperatures are nominal, ...


A (partial) introduction to software engineering practices and methods



[Excerpt: Introduction] Software engineering is concerned with all aspects of software production from the early stages of system specification through to maintaining the system after it has gone into use. [...] [\n] [...] As a discipline, software engineering has progressed very far in a very short period of time, particularly when compared to classical engineering fields (like civil or electrical engineering). In the early days of computing, not much more than 50 years ago, computerized systems were quite small. Most of the programming was done by scientists trying to ...


(INRMM-MiD internal record) List of keywords of the INRMM meta-information database - part 37

(February 2014)
Keywords: inrmm-list-of-tags   sequoia-sempervirens   sequoiadendron-giganteum   serbia   serbian-spruce   serendipity   serotinous-pine   service-as-a-software-substitute   service-tree   services   sesia-apiformis   sex-ratio   shade-tolerance   shake   shallow-soil   shape-index   shape-semantics   sharka-disease   short-rotation-forestry   short-term-vs-long-term   shrubs   si   sicily   sieve   sieve-parameter-training-architecture   sigma-pi-networks   silent-faults   silo-thinking   silver-bullet   silver-fir   silver-fir-decline   silvical-characteristics   silvics   silviculture   similarity   simple-sequence-repeats   simulation   single-nucleotide-polymorphism   sismic-hazard   site-quality   sitka-spruce   situational-awareness   size-asymmetry   slash-management   slavery   slope   slope-stability   slovakia   slovenia   slovenian-alps   smoke   smooth-transition   smyrnium-perfoliatum   snow   snow-avalances   so2   soc   social-engineering-risk   social-learning   social-media   social-system   society   socratea-exorrhiza   sodium   soft-constraint   soft-systems-approach   softw   software-control   software-engineering   software-errors   software-evolution   software-evolvability   software-libraries   software-patents   software-quality   software-security   software-uncertainty   software-validity   software-verification   soil   soil-carbon   soil-compactation   soil-conditions   soil-erosion   soil-evolution   soil-fertility   soil-food   soil-formation   soil-hydrophobicity   soil-loss   soil-microbial-properties   soil-moisture   soil-pollution   soil-resources   soil-restoration   soil-sealing   soil-stabilization   soil-thickness   soil-vs-vegetation  


List of indexed keywords within the transdisciplinary set of domains which relate to the Integrated Natural Resources Modelling and Management (INRMM). In particular, the list of keywords maps the semantic tags in the INRMM Meta-information Database (INRMM-MiD). [\n] The INRMM-MiD records providing this list are accessible by the special tag: inrmm-list-of-tags ( ). ...


A practical approach to programming with assertions

IEEE Transactions on Software Engineering, Vol. 21, No. 1. (January 1995), pp. 19-31,


Embedded assertions have been recognized as a potentially powerful tool for automatic runtime detection of software faults during debugging, testing, maintenance and even production versions of software systems. Yet despite the richness of the notations and the maturity of the techniques and tools that have been developed for programming with assertions, assertions are a development tool that has seen little widespread use in practice. The main reasons seem to be that (1) previous assertion processing tools did not integrate easily with existing programming environments, and (2) it is not ...
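A minimal sketch of the practice the abstract describes (the code is illustrative, not from the paper): assertions embedded as executable pre- and postconditions, checked at run time during debugging and testing, and removable in production builds (in CPython, by running with `python -O`).

```python
def withdraw(balance, amount):
    """Debit `amount` from `balance`, guarded by embedded assertions."""
    # Preconditions: document and check the caller's obligations.
    assert amount > 0, "precondition: amount must be positive"
    assert amount <= balance, "precondition: insufficient funds"
    new_balance = balance - amount
    # Postcondition: check the function's own guarantee.
    assert 0 <= new_balance < balance, "postcondition violated"
    return new_balance
```

A violated assertion fails loudly at the fault site rather than letting a corrupted value propagate, which is the runtime-fault-detection benefit the paper discusses.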


The pathologies of big data

Commun. ACM, Vol. 52, No. 8. (August 2009), pp. 36-44,


Scale up your datasets enough and your apps come undone. What are the typical problems and where do the bottlenecks surface? ...


Automated group facilitation for gathering wide audience end-user requirements

In System Sciences (HICSS), 2013 46th Hawaii International Conference on (January 2013), pp. 195-204,


System development projects continue to fail at unacceptable rates. Including a wide array of users in the requirements development process for a wide-audience system can help to increase system success. Facilitated group workshops can effectively and efficiently gather requirements from several different users. To decrease cost and increase the number of potential workshop participants, we designed an embodied agent facilitator to guide groups through the facilitation process. We extend previous research which found human-facilitated prompting to be effective at ...


Data Cognitive Complexity: A New Measure

In Proceedings of the World Congress on Engineering 2011, Vol. 1 (2011), 285


There are different facets of software complexity, some of which have been computed using widely accepted metrics like cyclomatic complexity and data/information flow metrics, but far fewer attempts have been made to measure the cognitive aspect of the complexity. The human mind's effort needed for the comprehension of the source code reflects a different dimension of complexity, which is what this paper measures. There are two aspects of the readability of the source code. One of these is the spatial aspect and ...


Measurement of the cognitive functional complexity of software

In Cognitive Informatics, 2003. Proceedings. The Second IEEE International Conference on (August 2003), pp. 67-74,


One of the central problems in software engineering is its inherited complexity. It is recognized that cognitive informatics plays an important role in understanding the fundamental characteristics of software. This paper models the cognitive weights of basic control structures of software, and develops a new concept of cognitive functional size for measuring software complexity. Comparative case studies between the cognitive functional size and physical sizes of 20 programs are conducted. It is found that for a given design, although the physical ...


Approaches to Compute Workflow Complexity

In The Role of Business Processes in Service Oriented Architectures, No. 06291. (2006)


During the last 20 years, complexity has been an interesting topic investigated in many fields of science, such as biology, neurology, software engineering, chemistry, psychology, and economics. A survey of the various approaches to understanding complexity has sometimes led to a measurable quantity with a rigorous but narrow definition, and other times to merely an ad hoc label. In this paper we investigate the complexity concept to avoid a vague use of the term `complexity' in workflow designs. ...


A framework and methodology for studying the causes of software errors in programming systems

Journal of Visual Languages & Computing, Vol. 16, No. 1-2. (February 2005), pp. 41-84,


An essential aspect of programmers’ work is the correctness of their code. This makes current HCI techniques ill-suited to analyze and design the programming systems that programmers use every day, since these techniques focus more on problems with learnability and efficiency of use, and less on error-proneness. We propose a framework and methodology that focuses specifically on errors by supporting the description and identification of the causes of software errors in terms of chains of cognitive breakdowns. The framework is based on ...


How do scientists develop and use scientific software?

In Software Engineering for Computational Science and Engineering, 2009. SECSE '09. ICSE Workshop on (May 2009), pp. 1-8,


New knowledge in science and engineering relies increasingly on results produced by scientific software. Therefore, knowing how scientists develop and use software in their research is critical to assessing the necessity for improving current development practices and to making decisions about the future allocation of resources. To that end, this paper presents the results of a survey conducted online in October-December 2008 which received almost 2000 responses. Our main conclusions are that (1) the knowledge required to develop and use scientific ...


The Evolution of the Laws of Software Evolution: A Discussion Based on a Systematic Literature Review

ACM Comput. Surv., Vol. 46, No. 2. (December 2013),


After more than 40 years of life, software evolution should be considered as a mature field. However, despite such a long history, many research questions still remain open, and controversial studies about the validity of the laws of software evolution are common. During the first part of these 40 years, the laws themselves evolved to adapt to changes in both the research and the software industry environments. This process of adaptation to new paradigms, standards, and practices stopped about 15 years ...


A view of 20th and 21st century software engineering

In Proceedings of the 28th International Conference on Software Engineering (2006), pp. 12-29,


George Santayana's statement, "Those who cannot remember the past are condemned to repeat it," is only half true. The past also includes successful histories. If you haven't been made aware of them, you're often condemned not to repeat their successes. In a rapidly expanding field such as software engineering, this happens a lot. Extensive studies of many software projects such as the Standish Reports offer convincing evidence that many projects fail to repeat past successes. This paper tries to identify at least some ...


Scientific Computing's Productivity Gridlock: How Software Engineering Can Help

Computing in Science & Engineering, Vol. 11, No. 6. (01 November 2009), pp. 30-39,


Hardware improvements do little to improve real productivity in scientific programming. Indeed, the dominant barriers to productivity improvement are now in the software processes. To break the gridlock, we must establish a degree of cooperation and collaboration with the software engineering community that does not yet exist. The accumulated technologies and practices of general computer science and software engineering have failed to impact scientific programming's productivity gridlock. To address this productivity crisis, the computer science and software engineering communities must better ...


The documentary structure of source code

Information and Software Technology, Vol. 44, No. 13. (October 2002), pp. 767-782,


Many tools designed to help programmers view and manipulate source code exploit the formal structure of the programming language. Language-based tools use information derived via linguistic analysis to offer services that are impractical for purely text-based tools. In order to be effective, however, language-based tools must be designed to account properly for the documentary structure of source code: a structure that is largely orthogonal to the linguistic but no less important. Documentary structure includes, in addition to the language text, all ...


Extreme terseness: some languages are more agile than others

In Extreme Programming and Agile Processes in Software Engineering, Lecture Notes in Computer Science, Vol. 2675 (24 June 2003), pp. 334-336,


While XP principles are independent of the languages in which software is developed, we can distinguish properties of programming languages that affect the agility of development. Some languages are inherently more agile than others, and the experience of developing software in these languages reflects this. A family of languages descended from the mathematics notation developed at Harvard in the 1950s by Iverson [1] shares properties of extreme terseness and abstractive power with weak data typing. The history of software development in ...


Dependently typed array programs don’t go wrong

The Journal of Logic and Algebraic Programming, Vol. 78, No. 7. (08 August 2009), pp. 643-664,


The array programming paradigm adopts multidimensional arrays as the fundamental data structures of computation. Array operations process entire arrays instead of just single elements. This makes array programs highly expressive and introduces data parallelism in a natural way. Array programming imposes non-trivial structural constraints on ranks, shapes, and element values of arrays. A prominent example where such constraints are violated are out-of-bound array accesses. Usually, such constraints are enforced by means of run time checks. Both the run time overhead inflicted ...
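To make the contrast concrete, here is the status quo the abstract argues against (a sketch in plain Python, not the paper's dependently typed approach): the structural constraint on array shapes is enforced only when the program runs, whereas a dependently typed array language can discharge the same constraint at compile time, so conforming programs "don't go wrong" and need no check at all.

```python
def checked_matmul(a, b):
    """Matrix product over nested lists, with the conformance
    constraint (columns of a == rows of b) checked at run time."""
    rows_a, cols_a = len(a), len(a[0])
    rows_b, cols_b = len(b), len(b[0])
    assert cols_a == rows_b, "inner dimensions must match"
    return [[sum(a[i][k] * b[k][j] for k in range(cols_a))
             for j in range(cols_b)]
            for i in range(rows_a)]
```

The run-time overhead and late failure of such checks are exactly the costs the paper's static approach aims to eliminate.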


Software evolution - Background, theory, practice

Information Processing Letters, Vol. 88, No. 1-2. (17 October 2003), pp. 33-44,


This paper opens with a brief summary of some 30 years of study of the software evolution phenomenon. The results of those studies include the SPE program classification, a principle of software uncertainty and laws of E-type software evolution. The laws were termed so because they encapsulate phenomena largely independent of the people, the organisations and the domains involved in the evolution of the E-type systems studied. Recent studies have refined earlier conclusions, yielded practical guidelines for software evolution management and ...


Macro-level software evolution: a case study of a large software compilation

Empirical Software Engineering, Vol. 14, No. 3. (1 June 2009), pp. 262-285,


Software evolution studies have traditionally focused on individual products. In this study we scale up the idea of software evolution by considering software compilations composed of a large quantity of independently developed products, engineered to work together. With the success of libre (free, open source) software, these compilations have become common in the form of ‘software distributions’, which group hundreds or thousands of software applications and libraries into an integrated system. We have performed an exploratory case study on one of ...


Best Practices for Scientific Computing

PLoS Biology, Vol. 12, No. 1. (26 Sep 2013), e1001745,


Scientists spend an increasing amount of time building and using software. However, most scientists are never taught how to do this efficiently. As a result, many are unaware of tools and practices that would allow them to write more reliable and maintainable code with less effort. We describe a set of best practices for scientific software development that have solid foundations in research and experience, and that improve scientists' productivity and the reliability of their software. ...


No Silver Bullet: Essence and Accidents of Software Engineering

Computer, Vol. 20, No. 4. (07 April 1987), pp. 10-19,


Fashioning complex conceptual constructs is the essence; accidental tasks arise in representing the constructs in language. Past progress has so reduced the accidental tasks that future progress now depends upon addressing the essence. Of all the monsters that fill the nightmares of our folklore, none terrify more than werewolves, because they transform unexpectedly from the familiar into horrors. For these, one seeks bullets of silver that can magically lay them to rest. The familiar software project, at least as seen by the ...


Behavioural contracts for a sound assembly of components

In Formal Techniques for Networked and Distributed Systems - FORTE 2003, Lecture Notes in Computer Science, Vol. 2767 (2003), pp. 111-126,


Component-based design is a new methodology for the construction of distributed systems and applications. In this new setting, a system is built by the assembly of (pre-)existing components. The problem of the compositional verification of such systems remains. We investigate methods and concepts for the provision of “sound” assemblies. We define an abstract, dynamic, multi-threaded, component model, encompassing both client/server and peer-to-peer communication patterns. We define a behavioural interface type language endowed with a (decidable) set of interface ...


Why software fails [software failure]

IEEE Spectrum, Vol. 42, No. 9. (06 September 2005), pp. 42-49,


Most IT experts agree that software failures occur far more often than they should despite the fact that, for the most part, they are predictable and avoidable. It is unfortunate that most organizations don't see preventing failure as an urgent matter, even though that view risks harming the organization and maybe even destroying it. Because software failure has tremendous implications for business and society, it is important to understand why this attitude persists. ...


Software engineering in an uncertain world

In Proceedings of the FSE/SDP workshop on Future of software engineering research (2010), pp. 125-128,


In this paper, we argue that the reality of today's software systems requires us to consider uncertainty as a first-class concern in the design, implementation, and deployment of those systems. We further argue that this induces a paradigm shift, and a number of research challenges that must be addressed. ...


Facts and fallacies of software engineering

(07 November 2003)
Keywords: myths   softw   software-engineering  


The practice of building software is a "new kid on the block" technology. Though it may not seem this way for those who have been in the field for most of their careers, in the overall scheme of professions, software builders are relative "newbies." In the short history of the software field, a lot of facts have been identified, and a lot of fallacies promulgated. Those facts and fallacies are what this book is about. There's a problem with those facts---and, ...


The Computing Machines in the Future

In Nishina Memorial Lectures, Vol. 746 (2008), pp. 99-114,


This address was presented by Richard P. Feynman as the Nishina Memorial Lecture at Gakushuin University (Tokyo), on August 9, 1985. ...


The Watts New Collection: Columns by the SEI’s Watts Humphrey

No. CMU/SEI-2009-SR-024. (2009)


Since June 1998, Watts Humphrey has taken readers of news@sei and its predecessor SEI Interactive on a process-improvement journey, step by step, in his column Watts New. The column has explored the problem of setting impossible dates for project completion, planning as a team using TSP, the importance of removing software defects, applying discipline to software development, approaching managers about a process improvement effort, and making a persuasive case for implementing it. After 11 years, Watts is taking a well-deserved retirement from writing the quarterly column. But ...


Teaching Real-World Programming

In BLOG@CACM (January 2013)


Free and Open Source Software underpinning the European Forest Data Centre

In European Geosciences Union (EGU) General Assembly 2013, Geophysical Research Abstracts, Vol. 15 (2013), 12101,


Worldwide, governments are increasingly focusing on free and open source software (FOSS) as a move toward transparency and the freedom to run, copy, study, change and improve the software. The European Commission (EC) is also supporting the development of FOSS [...]. In addition to the financial savings, FOSS contributes to scientific knowledge freedom in computational science (CS) and is increasingly rewarded in the science-policy interface within the emerging paradigm of open science. Since complex computational science applications may be affected by ...


Large-scale complex IT systems

Commun. ACM, Vol. 55, No. 7. (July 2012), pp. 71-77,


The reductionism behind today's software-engineering methods breaks down in the face of systems complexity. ...


Toward open science at the European scale: Geospatial Semantic Array Programming for integrated environmental modelling

In European Geosciences Union (EGU) General Assembly 2013, Geophysical Research Abstracts, Vol. 15 (2013), 13245,


[Excerpt] Interfacing science and policy raises challenging issues when large spatial-scale (regional, continental, global) environmental problems need transdisciplinary integration within a context of modelling complexity and multiple sources of uncertainty. This is characteristic of science-based support for environmental policy at European scale, and key aspects have also long been investigated by European Commission transnational research. Approaches (either of computational science or of policy-making) suitable at a given domain-specific scale may not be appropriate for wide-scale transdisciplinary modelling for environment (WSTMe) and ...


Building diverse computer systems

In Operating Systems, 1997., The Sixth Workshop on Hot Topics in (May 1997), pp. 67-72,


Diversity is an important source of robustness in biological systems. Computers, by contrast, are notable for their lack of diversity. Although homogeneous systems have many advantages, the beneficial effects of diversity in computing systems have been overlooked, specifically in the area of computer security. Several methods of achieving software diversity are discussed based on randomizations that respect the specified behavior of the program. Such randomization could potentially increase the robustness of software systems with minimal impact on convenience, usability, and efficiency. ...


Software-level soft-error mitigation techniques

In Soft Errors in Modern Electronic Systems, Vol. 41 (2011), pp. 253-285,


Several application domains exist, where the effects of Soft Errors on processor-based systems cannot be faced by acting on the hardware (either by changing the technology, or the components, or the architecture, or whatever else). In these cases, an attractive solution lies in just modifying the software: the ability to detect and possibly correct errors is obtained by introducing redundancy in the code and in the data, without modifying the underlying hardware. This chapter provides an overview of the methods resorting ...


Computational Sustainability: Computational methods for a sustainable environment, economy, and society

The Bridge, Vol. 39, No. 4. (2009), pp. 5-13


The dramatic depletion of natural resources in the last century now threatens our planet and the livelihood of future generations. Our Common Future, a report by the World Commission on Environment and Development published in 1987, introduced for the first time the notion of “sustainable development: development that meets the needs of the present without compromising the ability of future generations to meet their needs” (UNEP, 1987). The concerns raised in that report were reiterated by the Intergovernmental Panel on Climate Change (IPCC, 2007). In ...


Dealing with Risk in Scientific Software Development

IEEE Software, Vol. 25, No. 4. (24 July 2008), pp. 21-28,


The development of scientific software involves risk in the underlying theory, its implementation, and its use. Through a series of interviews, the authors explored how research scientists at two Canadian universities developed their software. These interviews indicated that the scientists used a set of strategies to address risk. They also suggested where the software engineering community could perform research focused on specific problems faced by scientific software developers. ...


Algorithm Engineering - An Attempt at a Definition

In Efficient Algorithms, Vol. 5760 (2009), pp. 321-340,


This paper defines algorithm engineering as a general methodology for algorithmic research. The main process in this methodology is a cycle consisting of algorithm design, analysis, implementation and experimental evaluation that resembles Popper’s scientific method. Important additional issues are realistic models, algorithm libraries, benchmarks with real-world problem instances, and a strong coupling to applications. Algorithm theory with its process of subsequent modelling, design, and analysis is not a competing approach to algorithmics but an important ingredient of algorithm engineering. ...


Computer science is not a science

Commun. ACM, Vol. 56, No. 1. (January 2013), pp. 8-9,


An abstract is not available. ...


Exceptional C++



Exceptional C++ shows by example how to go about sound software engineering in standard C++. Do you enjoy solving thorny C++ problems and puzzles? Do you relish writing robust and extensible code? Then take a few minutes and challenge yourself with some tough C++ design and programming problems. The puzzles and problems in Exceptional C++ not only entertain, they will help you hone your skills to become the sharpest C++ programmer you can be. Many of these problems are culled from ...


Subjective evaluation of software evolvability using code smells: an empirical study

Empirical Software Engineering, Vol. 11, No. 3. (1 September 2006), pp. 395-431,


This paper presents the results of an empirical study on the subjective evaluation of code smells that identify poorly evolvable structures in software. We propose use of the term software evolvability to describe the ease of further developing a piece of software and outline the research area based on four different viewpoints. Furthermore, we describe the differences between human evaluations and automatic program analysis based on software evolvability metrics. The empirical component is based on a case study in a Finnish ...


Computational science: ...Error

Nature, Vol. 467, No. 7317. (14 October 2010), pp. 775-777,


…why scientific programming does not compute. [\n] When hackers leaked thousands of e-mails from the Climatic Research Unit (CRU) at the University of East Anglia in Norwich, UK, last year, global-warming sceptics pored over the documents for signs that researchers had manipulated data. No such evidence emerged, but the e-mails did reveal another problem — one described by a CRU employee named “Harry”, who often wrote of his wrestling matches with wonky computer software. ...


Social processes and proofs of theorems and programs

Commun. ACM, Vol. 22, No. 5. (May 1979), pp. 271-280,


It is argued that formal verifications of programs, no matter how obtained, will not play the same key role in the development of computer science and software engineering as proofs do in mathematics. Furthermore the absence of continuity, the inevitability of change, and the complexity of specification of significantly many real programs make the formal verification process difficult to justify and manage. It is felt that ease of formal verification should not dominate program language design. ...


Software Challenges in Achieving Space Safety

Journal of the British Interplanetary Society, Vol. 62 (2009), pp. 265-272


Techniques developed for hardware reliability and safety do not work on software-intensive systems. This is because software does not satisfy the assumptions underlying these techniques. The new problems and why the current approaches are not effective for complex, software-intensive systems are described in the first part of the article. A new approach to hazard analysis and safety-driven design is then presented. Rather than being based on reliability theory, as most current safety engineering techniques are, the new approach builds on system and control theory. ...


Notes on structured programming

In Structured programming (1972), pp. 1-82

This page of the database may be cited as:
Integrated Natural Resources Modelling and Management - Meta-information Database.


Publication metadata

Bibtex, RIS, RSS/XML feed, Json, Dublin Core

Meta-information Database (INRMM-MiD).
This database integrates a dedicated meta-information database in CiteULike (the CiteULike INRMM Group) with the meta-information available in Google Scholar, CrossRef and DataCite. The Altmetric database with Article-Level Metrics is also harvested. Part of the provided semantic content (machine-readable) is also made human-readable thanks to the DCMI Dublin Core viewer. Digital preservation of the meta-information indexed within the INRMM-MiD publication records is implemented thanks to the Internet Archive.
The library of INRMM related publications may be quickly accessed with the following links.
Search within the whole INRMM meta-information database:
Search only within the INRMM-MiD publication records:
Full-text and abstracts of the publications indexed by the INRMM meta-information database are copyrighted by the respective publishers/authors. They are subject to all applicable copyright protection. The conditions of use of each indexed publication are defined by its copyright owner. Please be aware that the indexed meta-information relies entirely on voluntary work and constitutes an incomplete and heterogeneous work-in-progress.
INRMM-MiD was experimentally established by the Maieutike Research Initiative in 2008 and then improved with the help of several volunteers (with a major technical upgrade in 2011). This new integrated interface has been operational since 2014.