218 research outputs found

    Application of Automation in Building Design According to Structural and Circular Economy Principles

    A significant amount of the emissions from the construction industry is related to the production and use of materials. Because of this, there is an urgent need to create solutions that can drastically reduce these emissions without affecting the structural integrity of the structures. The technological advancement of the industry has made it possible to create such solutions, and with the possibility of automating parts of the design process, the solutions could also help make that process more efficient. This master's thesis proposes a possible solution for reducing the environmental impact of buildings through the implementation of automation in the design process. The foundation of this solution is a plug-in created in Visual Studio 2022, containing components that automate the design of the buildings. The components are written in the programming language C#. The plug-in is imported into Grasshopper, where the components are used to create algorithms for a CO2 emission analysis and a FEM analysis, which are used to analyze the structural elements of different buildings in terms of CO2 emission and utilization. The results, together with the input variables that control the building geometry, are connected to an optimization component that iterates through a number of different building designs and material choices before presenting the best alternative. As a result, this solution provides more sustainable building designs while reducing manual, time-consuming work. The four case studies in this thesis highlight different aspects of the solution. Case study 1 illustrates how the results from the CO2 emission analysis and the FEM analysis can be used to decide which designs and materials to choose. The premise of case study 2 is to investigate three different building designs with the same requirements, to highlight the difference the designs and material choices make in terms of CO2 emission. Case studies 3 and 4 both use optimization to find the best design for different types of buildings, in terms of both CO2 emission and utilization. The findings from the four case studies show that this solution has great potential. Most of the results are promising, but the algorithms require some tuning, as well as further development, to provide satisfactory results.
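    To make the automated evaluation step concrete, the following is a minimal, illustrative sketch in plain C# (the thesis's plug-in targets the Grasshopper SDK, which is not shown here). The material names, emission factors, and the simple feasibility rule are assumptions for illustration only and are not taken from the thesis.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Minimal sketch of the kind of per-element evaluation such design components
// automate: embodied CO2 from material quantities and a utilization ratio from
// a FEM result. All names, factors and the scoring rule are illustrative assumptions.
record Element(string Material, double VolumeM3, double AppliedForceKN, double CapacityKN);

class DesignEvaluation
{
    // Hypothetical cradle-to-gate emission factors [kg CO2e per m3 of material].
    static readonly Dictionary<string, double> EmissionFactors = new()
    {
        ["Concrete C30/37"] = 300.0,
        ["Steel S355"]      = 12000.0,
        ["Glulam GL30c"]    = 60.0,
    };

    static void Main()
    {
        var elements = new List<Element>
        {
            new("Glulam GL30c", 1.2, 180.0, 260.0),
            new("Steel S355", 0.05, 340.0, 400.0),
            new("Concrete C30/37", 2.0, 500.0, 900.0),
        };

        double totalCo2 = elements.Sum(e => EmissionFactors[e.Material] * e.VolumeM3);
        double maxUtilization = elements.Max(e => e.AppliedForceKN / e.CapacityKN);

        Console.WriteLine($"Total embodied CO2: {totalCo2:0} kg CO2e");
        Console.WriteLine($"Max utilization:    {maxUtilization:0.00}");

        // A design is only a valid candidate if no element is overloaded.
        Console.WriteLine(maxUtilization <= 1.0
            ? "Feasible candidate - pass CO2 to the optimizer as the objective"
            : "Infeasible - at least one element exceeds its capacity");
    }
}
```

    The two outputs of such an evaluation, total CO2 and utilization, correspond to the quantities the optimization component described above iterates over across candidate designs and materials.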

    Value of information : reliability of 3D reflection seismology in exploration

    Decision makers who face uncertain prospects often gather information with the intention of reducing uncertainty. If we can reduce uncertainty about future outcomes, we can make choices that give us a better chance of a good outcome. At least this is so in a perfect world, where information is free and indicates the outcome of the uncertain event with certainty. In real-world business nothing comes for free, and information has a cost. We should therefore investigate the benefits of new information before spending time and money to collect it. Value of information (VoI) is a decision-analytic tool used for this purpose. Schlaifer (1959) was the first to discuss it in a general context, and Grayson (1960) applied it to the oil and gas industry. Its use has grown in recent years. In this work the VoI concept is described with emphasis on the question of whether or not to shoot a 3D seismic survey before drilling a wildcat well, a question often encountered in the petroleum industry. The different steps in the working process are described, and Bayes' theorem is introduced for probability updating. The reliabilities needed in this calculation, and how to assess them, are the main focus of this work. An overview of how this has been done in previous publications is given, and the different approaches are discussed. A model is then developed to aid in the assessment of reliabilities for seismic data gathering. The model is closely linked to the data acquired in the 3D seismic survey and to the properties of the actual prospect. The reliability assessment model is subjective, but it lets the experts express their knowledge through weights and the degree of presence of prospect properties instead of probabilities. This is directly related to their professional and technical skills. Probability is not
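    As a compact illustration of the probability updating and the VoI comparison described above (the notation is assumed here, not taken from the thesis): with a prior success probability P(S) for the prospect and assessed reliabilities P(+|S) and P(+|not S) for a positive seismic indication, Bayes' theorem gives the updated probability, and the survey is worth acquiring only if its VoI exceeds its cost.

```latex
% Posterior probability of success given a positive seismic indication "+",
% expressed through the prior P(S) and the assessed reliabilities.
P(S \mid +) = \frac{P(+ \mid S)\,P(S)}{P(+ \mid S)\,P(S) + P(+ \mid \bar{S})\,\bigl(1 - P(S)\bigr)}

% Value of the (imperfect) survey: expected value of acting on the signal,
% minus the expected value of the best action taken without it.
\mathrm{VoI} = \sum_{j \in \{+,-\}} P(j)\,\max_{a} \mathrm{E}\!\left[v(a) \mid j\right]
             \;-\; \max_{a} \mathrm{E}\!\left[v(a)\right]
```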

    Barn som pårørende

    Klær og moter i Sogn på 1970-tallet

    Bachelor's thesis in History, SA 523 201

    Smidig i historiske fotspor : Et komparativt studie av to etablerte virksomheter som utforsker samspillet mellom strategi, struktur og smidig som respons på et næringsliv som endres gjennom digital utvikling

    Our research study has investigated how two established companies respond to digital and market challenges by exploring the interplay between strategy, structure, and agility, as illuminated by the leaders' perceptions. The research question we have examined is: How do leaders in established businesses navigate the interplay between strategy, structure, and agility in response to continuously changing environments? The study employed an exploratory, comparative case analysis method. We have investigated two established businesses, both of which have experience with digital development and have demonstrated organizational adaptability by implementing large-scale agile methodology and multi-team systems. Previous research has only to a limited extent addressed the interaction between strategy, structure, and agility during organizational transformation processes in established Norwegian businesses. Our research has culminated in a triadic model for increased competitiveness in a rapidly changing digital world, highlighting how the interplay between strategy, structure, and agility has influenced the organizational transformation. We have identified several components that are critical for making this interplay work: 1) Drivers for change, 2) Strategic competence development, 3) Balanced autonomy, 4) Evolution of the leadership role, and 5) Common language and values. Through our analysis, we have also found that leaders in these companies have incorporated a range of operational implementation mechanisms to advance the organizational transformation processes effectively. These are competence exchange, prioritization tools, product strategy, a glossary, an interaction model, a culture document, new roles, and new learning arenas. This underlines both a strategic and a tactical need to view the interplay between strategy, structure, and agility as a response to continuously changing environments. Further research should build on our model to explore the variation in how different organizations manage this transformation, which will enrich the academic perspective going forward.

    Association between copy number variations in the OCA2-HERC2 locus and human eye colour

    Human eye colour variation is strongly associated with single nucleotide polymorphisms (SNPs) in the OCA2-HERC2 locus, especially rs12913832, which is found in an enhancer element of OCA2. In a previous study we found that 43 out of 166 individuals in a Norwegian population with the brown eye colour genotype HERC2 rs12913832:AA or AG did not have the expected brown eye colour. To investigate whether duplications or deletions in the OCA2-HERC2 locus could explain the blue eye colour in these individuals, we analysed massively parallel sequencing (MPS) data for copy number variations (CNVs) in the OCA2-HERC2 region. The ~500 kb OCA2-HERC2 locus was sequenced in 94 individuals with the rs12913832:AG and AA genotypes. Of these, 43 were observed to have blue eye colour and 51 were observed to have brown eye colour. CNVs were analysed using R and the R package panelcn.MOPS, a CNV detection tool for targeted NGS panel data. In rs12913832:AG individuals, CNVs in 32 regions were significantly associated with blue eye colour (Benjamini-Hochberg adjusted p-value ≤ 0.05). In rs12913832:AA individuals, CNVs in 14 regions were associated with blue eye colour using raw p-values (p ≤ 0.05). The functional effects of these CNVs on OCA2 expression are yet to be investigated. However, this study suggests that CNVs in the OCA2-HERC2 locus might explain why some rs12913832:AG and AA individuals unexpectedly have blue eyes.
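    For readers unfamiliar with the multiple-testing correction mentioned above, the following is a minimal sketch of the Benjamini-Hochberg adjustment in C#. It only illustrates the adjustment step itself; the study's actual analysis was carried out in R with panelcn.MOPS, and the p-values below are made up.

```csharp
using System;
using System.Linq;

// Minimal sketch of Benjamini-Hochberg p-value adjustment for multiple testing.
// Illustrative only; the study itself used R and panelcn.MOPS.
static class MultipleTesting
{
    public static double[] BenjaminiHochberg(double[] pValues)
    {
        int n = pValues.Length;
        // Indices of the p-values sorted in ascending order.
        int[] order = Enumerable.Range(0, n)
                                .OrderBy(i => pValues[i])
                                .ToArray();

        var adjusted = new double[n];
        double runningMin = 1.0;
        // Walk from the largest p-value down, enforcing monotonicity and capping at 1.
        for (int rank = n - 1; rank >= 0; rank--)
        {
            int idx = order[rank];
            double candidate = pValues[idx] * n / (rank + 1);
            runningMin = Math.Min(runningMin, Math.Min(candidate, 1.0));
            adjusted[idx] = runningMin;
        }
        return adjusted;
    }

    static void Main()
    {
        // Hypothetical raw p-values for a few candidate CNV regions.
        double[] raw = { 0.001, 0.004, 0.03, 0.2, 0.5 };
        double[] adj = BenjaminiHochberg(raw);
        for (int i = 0; i < raw.Length; i++)
            Console.WriteLine($"raw = {raw[i]:0.###}  BH-adjusted = {adj[i]:0.###}");
    }
}
```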

    Dimensjonering av betongdekket til Hamnevegen bru

    In our bachelor thesis, we wanted to carry out the structural design of a concrete bridge. After conversations with our internal supervisor in October/November 2021, we decided on Hamnevegen bridge in Verdal. To narrow down the thesis, we chose to design only the concrete deck of the bridge, but we wanted to carry out the design with both normal and prestressed reinforcement to determine the best alternative for this structure. Planning of the project and the necessary preliminary work began in January 2022 and lasted for two months. In March, we started the design process by defining the statics of the structure, followed by studying the relevant load cases. A computer program is usually used for load calculations, simulating different load cases to find the worst load combination for the structure. Since we did not have access to such a program, almost all calculations have been done by hand. The only exception is Focus Konstruksjon, a FEM (finite element method) program used to model the selected load cases and obtain the bending moment and shear force diagrams. The results from Focus have been verified by manual calculations. The requirements given in the Eurocodes and by Statens Vegvesen have changed since the bridge was built in the 1970s, traffic loads have increased, and the construction industry has become increasingly digitalized. In our thesis, we have chosen to follow only the requirements given in the Eurocodes. In addition, we have made various simplifications to make it possible to do the necessary calculations by hand. The combination of these factors means that our results may differ somewhat from the original results from the 1970s. As a simplification, the concrete deck was treated as two separate and independent beams, one in the longitudinal direction and one in the transverse direction, so the calculations for one beam could be made without considering the other. Both beams were checked in the ultimate limit state (ULS) and the serviceability limit state (SLS), and the necessary normal and prestressed reinforcement was calculated according to the requirements of Eurocode 2. For the SLS calculations, i.e. deflection, cracking and stresses, "Betongkonstruksjoner" by Sørensen was used as supporting literature in addition to Eurocode 2. Based on our calculations, with multiple simplifications, we concluded that the longitudinal beam should be prestressed, while normal reinforcement was the best option for the transverse beam. This is favourable because it avoids a heavily over-dimensioned bridge deck, and it also makes the structure easier to build in practice, given that it is actually a slab. The SLS calculations showed that the deflection requirement was not satisfied for two of the spans of the longitudinal beam with normal reinforcement, while the maximum deflection was within the limits for all spans with prestressed reinforcement. Cracking was within the limits for normal reinforcement, but the tensile stresses exceeded the allowed value.
    Calculation of stresses for prestressed reinforcement in stage I, the uncracked section, showed that the tensile stresses were below the maximum allowed value. Stage II, the cracked section, was therefore not calculated, in order to limit the scope of the thesis. For shear reinforcement, we chose to use 2ϕ16 shear links, even though the calculations were based on 2ϕ10, since ϕ16 is a more common bar diameter for shear reinforcement in bridge structures. The final report gives a more detailed description of the structural design of the bridge, as well as explanations of all the simplifications that have been made. At the end of the report, we reflect on our choices and on how they have influenced the calculations and results. Furthermore, we reflect on what could have been done differently to obtain a more accurate result. We have also mentioned various factors that were disregarded to limit the scope of the thesis. Normally, these factors would be included in the calculations, and a brief reflection on their influence has therefore been included.
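    As an illustration of the kind of hand calculation referred to above, the sketch below evaluates the standard Eurocode 2 expression for the shear capacity of vertical links, V_Rd,s = (A_sw/s)·z·f_ywd·cot θ (EC2 clause 6.2.3). All numerical inputs are assumed for illustration and are not the values used in the thesis.

```csharp
using System;

// Minimal sketch of a Eurocode 2 shear check with vertical links (EC2 6.2.3).
// All input values below are illustrative assumptions, not taken from the thesis.
class ShearCheck
{
    static void Main()
    {
        double phi = 16.0;            // link diameter [mm] (2ϕ16 as chosen in the thesis)
        double legs = 2.0;            // number of legs per link
        double s = 200.0;             // link spacing [mm] (assumed)
        double z = 0.9 * 450.0;       // inner lever arm, ~0.9·d with d = 450 mm (assumed)
        double fywd = 500.0 / 1.15;   // design yield strength of links [MPa], B500 / γs
        double cotTheta = 2.5;        // strut inclination, EC2 allows 1.0 <= cot θ <= 2.5

        double asw = legs * Math.PI * phi * phi / 4.0;          // link area per spacing [mm²]
        double vRdS = (asw / s) * z * fywd * cotTheta / 1000.0; // shear capacity [kN]

        double vEd = 400.0;           // design shear force [kN] (assumed)
        Console.WriteLine($"V_Rd,s = {vRdS:0.0} kN, V_Ed = {vEd:0.0} kN");
        Console.WriteLine(vRdS >= vEd ? "Shear capacity OK" : "Increase links or reduce spacing");
    }
}
```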

    Association between Variants in the OCA2-HERC2 Region and Blue Eye Colour in HERC2 rs12913832 AA and AG Individuals

    The OCA2-HERC2 region is strongly associated with human pigmentation, especially eye colour. The HERC2 SNP rs12913832 is currently the best-known predictor of blue and brown eye colour. However, in a previous study we found that 43 of 166 Norwegians with the brown eye colour genotype rs12913832:AA or AG did not have the expected brown eye colour. In this study, we carried out massively parallel sequencing of a ~500 kbp HERC2-OCA2 region in 94 rs12913832:AA and AG Norwegians (43 blue-eyed and 51 brown-eyed) to search for novel blue eye colour variants. The new candidate variants were subsequently typed in a Norwegian biobank population (total n = 519) for population-specific association analysis. We identified five new variants, rs74409036:A, rs78544415:T, rs72714116:T, rs191109490:C and rs551217952:C, as the most promising candidates for explaining blue eye colour in individuals with the rs12913832:AA and AG genotypes. Additionally, we confirmed the association of the missense variants rs74653330:T and rs121918166:T with blue eye colour, and observed lighter skin colour in rs74653330:T individuals. In total, 37 (86%) of the 43 blue-eyed rs12913832:AA and AG Norwegians could potentially be explained by these seven variants, and we suggest including them in future prediction models.