Archaeologists with the Vejle Museums have unearthed a 1,600-year-old weapon offering that includes a chainmail as well as two fragments of a Roman helmet.
Elias Witte Thomasen uncovering the massive weapon offering at the site of Løsning Søndermark
“During the examination of one of the largest weapon deposits — offered in a posthole of a house — two unusual iron plates were unearthed, both roughly the size of a palm,” Vejle Museums archaeologist Elias Witte Thomasen and colleagues said in a statement.
“Only with the help of X-ray imaging were conservators and archaeologists able to see beneath the thick layers of rust surrounding the objects.”
“The results revealed an extremely rare find — the remains of a Roman helmet.”
This find is the only Roman helmet known from Denmark and the earliest iron helmet found in the country.
“The two plates consist of a neck plate and a decorated cheek plate from a so-called crest helmet, a type used in the Roman Empire in the 4th century CE,” the researchers said.
“Roman helmet finds from the Iron Age are exceptionally rare in southern Scandinavia, and there are no direct parallels to this discovery.”
“The few similar finds come from Thorsbjerg Moor in Schleswig or from southern Sweden and Gotland; none come from Denmark.”
X-ray image of neck and cheek guard from the Roman helmet
The discovery raises a significant question: why were only the neck plate and one cheek plate recovered?
“The answer lies in the nature of post-battle rituals during this period of the Iron Age, where weapons and military equipment were rarely deposited intact,” Dr. Thomasen said.
“Spearheads were separated from their shafts, and equipment was destroyed and divided among the parties involved in the conflict.”
“The missing cheek plate and helmet bowl were likely distributed elsewhere.”
“The helmet may have belonged to a Germanic warlord who served in the Roman auxiliaries, bringing his personal equipment home after his service ended,” he said.
“Alternatively, it could have been looted from a Roman legionary in battles closer to the Empire’s Germanic frontier and later brought to Jutland.”
“Both on its own and in relation to the broader deposition of weapons and military gear at the site, the helmet provides valuable insights into the military elite of the Iron Age and their connections to the powerful southern neighbor — the Roman Empire.”
A team of archaeologists uncovered a trove of weapons at Løsning Søndermark in Denmark during excavations carried out in advance of a motorway being built on the site. The deposit is a weapon sacrifice: an Iron Age community deliberately deposited over 100 lances and spears, along with other military gear. The deposit is a compelling look into the social and military wheelings and dealings of the group that inhabited the region. The archaeological team concluded that the site was not a weapons workshop or a barracks—settings where piles of weapons would not be out of place.
“From the very first surveys, we knew this was going to be extraordinary, but the excavation has exceeded all our expectations,” said Elias Witte Thomasen, an archaeologist at the Vejle Museums and leader of the excavation, in a museums release. “The sheer number of weapons is astonishing, but what fascinates me most is the glimpse they provide into the societal structure and daily life of the Iron Age. We suddenly feel very close to the people who lived here 1,500 years ago.”
The metal objects found at the site include 119 lances and spears, along with fragments of at least two oath rings and a bugle. The rings were similar to those depicted on bracteates: medallions worn around the neck that often expressed the wearers’ political or military allegiance. The bracteates were etched with a design reminiscent of the chainmail found at the site; the chainmail is the first discovered on a settlement in southern Scandinavia. The team also recovered ceramics and flint objects; by their reckoning, the settlement held significant social and economic influence in the area, allowing it to mount military campaigns in the surrounding region.
The team is not sure whether the weaponry belonged to the locals who buried it or was booty recovered from a defeated enemy, though further excavations and analysis of the weapon trove may provide answers. War booty was common in central Europe during the first few centuries CE. According to the National Museum of Denmark, 20 Danish bogs contained Iron Age weaponry with “traces of ritual destruction.” Those weapons may have been offerings to the gods. The people at Løsning Søndermark did not sink their weapons in a bog, but instead chose to hide them in a posthole. The team deduced that the weaponry was deposited during the house’s deconstruction, because the weapons were placed in a hole left by a post that held up the structure’s roof. The hole was then backfilled—so it was not the most convenient place to store a sword you want easy access to.
Earlier work in the area has yielded other notable finds, including one curious burial of a woman in a wagon.
According to the release, scientific findings about the buried goods will be published on the museums’ website, and some of the items may be on display at the Vejle Cultural Museum as soon as early 2025.
LBV Magazine English Edition
Some 1,500 years ago, a chieftain buried an arsenal of weapons sufficient to equip a small army in two structures located northwest of Hedensted. Archaeologists have just discovered it in Løsning Søndermark, and the find, which includes an impressive quantity of weapons, has been cataloged as one of the most significant in recent times. The way these elements were deposited suggests that it was a ritual offering, possibly dedicated to the gods in search of protection or divine favors.
The chainmail discovered in Løsning Søndermark is one of the few found in southern Scandinavia. What makes this discovery particularly remarkable is that, unlike other specimens found in tombs or bogs, this chainmail was found in the context of a settlement. Crafting such a garment required a high degree of skill, meaning that only members of the warrior elite could own one. The piece is being carefully preserved in a conservation center to ensure its study and future exhibition.
One of the most surprising finds of this excavation was the discovery of two iron plates that turned out to be fragments of a Roman helmet. They were identified as the neck guard and one of the decorated cheek guards of a crested helmet, a model characteristic of the 4th century in the Roman Empire. This type of find is extremely rare in Scandinavia, and no direct parallels have been found in Denmark, increasing the significance of the discovery.
Among the unearthed objects, fragments of two bronze rings were also found, similar to those depicted on the gold bracteates of the Vindelev treasure. These rings were used as symbols of authority and commitment by tribal leaders. The find suggests that the chieftain who owned them held significant status within his community, as these objects are closely linked to leadership and sworn loyalty. Part of the finds, including the fragments of the Roman helmet, will be on display at the Vejle Cultural Museum.
Source: Vejle Museerne
Medievalists.net
Archaeologists in Denmark have uncovered a remarkable Iron Age weapon deposit near Hedensted, containing more than a hundred weapons and a unique chainmail—now joined by another extraordinary discovery: fragments of a Roman helmet. The excavation, conducted by The Vejle Museums as part of the expansion of a highway, has uncovered one of the largest known weapon sacrifices in Scandinavia. The deposit includes 119 lances and spears and a rare chainmail—an elite piece of armour requiring immense skill and resources to produce.
These weapons and military artifacts were not merely lost or abandoned; they were carefully buried within two houses. In one house, weapons were placed in postholes when the structure was dismantled; in the other, they were packed around roof-bearing posts before the house was even completed. Such placements hint at an offering tied to the residence of a powerful local chieftain, possibly to secure divine favour or commemorate a military victory.
The already extraordinary find became even more remarkable with the identification of two iron plates discovered in a posthole. X-ray analysis revealed that they belonged to a Roman crest helmet from the 4th century. This marks the first time a Roman helmet has been found in Denmark. Roman helmets are exceptionally rare in southern Scandinavia, with the closest parallels found in Thorsbjerg Moor in Schleswig and parts of Sweden. The find raises intriguing questions: how did a Roman helmet end up in Jutland? The helmet may have belonged to a Germanic warlord who served in the Roman auxiliaries, bringing his personal equipment home after his service ended; alternatively, “it could have been looted from a Roman legionary in battles closer to the empire’s Germanic frontier and later brought to Jutland.”
Like much of the other equipment in the deposit, the helmet was deliberately destroyed before being deposited; only a decorated cheek plate and a neck plate survived. This aligns with known post-battle rituals in the Iron Age, where weapons were systematically dismantled before being sacrificed. In addition, the excavation uncovered fragments of at least two bronze neck rings, resembling those seen in the Vindelev Hoard and other depictions of Iron Age rulers. These rings, associated with rulers and warriors who swore loyalty to their leaders, indicate that the site was home to an elite warrior class with connections beyond Denmark—possibly even to the Roman Empire.
The Løsning Søndermark site was continuously inhabited for nearly 500 years and was home to influential individuals who commanded warriors and participated in military campaigns. The vast number of weapons uncovered provides tangible evidence of such conflicts. Through further excavation and analysis, archaeologists hope to determine whether the weapons belonged to local warriors or were taken as war booty from defeated enemies. The latter would align with known South Scandinavian practices, where war spoils were often ritually deposited in wetlands.
This discovery has already drawn international attention, briefly putting the small Jutland town of Løsning in the global spotlight. Starting February 8th, parts of the weapon deposit will be displayed at the Vejle Cultural Museum. As research continues, this find promises to shed new light on the military and religious dynamics of Iron Age Scandinavia, highlighting the region’s deep ties to the wider European world.
For updates on the findings and their significance, visit vejlemuseerne.dk
Top Image: Sword and other weapons – Photo: Vejle Museums
Business is growing, and in 2020 we expanded our headquarters in Hedensted. Our total facilities in Hedensted now cover 11,300 square meters, divided between R&D facilities and manufacturing. The expansion accommodates an increasing demand for the technologies and solutions we develop. During 2020 Eltronic Wind Solutions also expanded our business in China: at our location in Taicang we have facilities and highly skilled employees for developing and building prototypes and for training on a wide variety of equipment.
Digital transformation is also changing the way we do business at Eltronic Wind Solutions. We are very excited about our new digital showroom for LIVE presentations of our equipment and solutions. Via Augmented Reality (AR) technology we are able to demonstrate the features and functions of the products, and the digital setup gives our customers an “on-site” experience. Eltronic Wind Solutions is a solution provider focused on increasing safety and efficiency and improving the competitiveness of our customers’ business. Eltronic Wind Solutions is your leading strategic partner when it comes to innovative equipment solutions for the entire life cycle of wind turbines, both onshore and offshore – including concept design. Eltronic Wind Solutions A/S is a part of Eltronic Group, with more than 500 highly professional engineers, specialists and technicians covering competencies within a wide range of disciplines.
The finds were made at Løsning Søndermark during the expansion of the E45 motorway. The discovery includes nearly 200 weapons and items that would have equipped a small army. The arsenal comprises 119 lances and spears and a rare chainmail shirt—one of only 14 ever found in Denmark and the first recovered from a settlement rather than a burial site. Other notable artifacts include fragments of two bronze “oath rings,” symbols of power and influence in the Nordic Iron Age.
“From the very first surveys, we knew this was going to be extraordinary, but the excavation has exceeded all our expectations,” said Elias Witte Thomasen, an archaeologist from the Vejle Museums who led the excavation. “The sheer number of weapons is astonishing. We suddenly feel very close to the people who lived here 1,500 years ago.”
The artifacts were deliberately buried within two houses in the settlement. One deposit was made during the construction of a house, with weapons packed around its load-bearing posts, while the other was made during a house’s demolition, with weapons placed in the empty postholes. These practices suggest the offerings were part of ceremonial or sacrificial rituals linked to the residence of a powerful chieftain rather than items from a workshop or barracks. Archaeologists believe the weapons were likely spoils of war, as such practices of collecting and sacrificing war booty were common during the South Scandinavian Iron Age; further analysis will determine whether the items were locally produced or acquired from defeated enemies. The chainmail, an expensive and labor-intensive piece of armor, underscores the chieftain’s elite status. Such a garment was typically reserved for high-ranking warriors or leaders, and its sacrifice would have been a significant display of wealth and power.
Beyond their practical function, the chainmail and other items bear symbolic value. The oath rings resemble those depicted on gold bracteates, such as those from the nearby Vindelev Hoard, which feature figures holding rings as symbols of authority. Unlike similar deposits, typically found in bogs or funerary contexts, this discovery provides a rare glimpse into life within a settlement. The site’s unique nature adds valuable context to the understanding of social and military structures in Denmark during the waning days of the Roman Empire.
The excavation leader Thomasen remarked on this broader significance: “The finds allow us to connect the dots between daily life [...] This site is a key to understanding the resilience and influence of Iron Age societies on the northern edge of the Roman Empire.” The artifacts will now undergo further study, including of their origins and production methods. This research is expected to enhance understanding of the warriors and the intricate social networks of the Iron Age.
More information: Vejle Museums
Volume 6 - 2019 | https://doi.org/10.3389/fmed.2019.00034
For over a decade the term “Big data” has been used to describe the rapid increase in the volume, variety, and velocity of information available, not just in medical research but in almost every aspect of our lives. We now have the capacity to rapidly generate enormous quantities of data. “Big data” no longer means what it once did: the term has expanded and now refers not just to large data volume, but to our increasing ability to analyse and interpret those data. Tautologies such as “data analytics” and “data science” have emerged to describe approaches to the volume of available information as it grows ever larger. New methods dedicated to improving data collection, processing, and interpretation continue to be developed. Exploiting new tools to extract meaning from large-volume information has the potential to drive real change in clinical practice, from personalized therapy and intelligent drug design to population screening and electronic health record mining. Yet where new technology promises “Big Advances,” significant challenges remain.
Here we discuss both the opportunities and challenges posed to biomedical research by our increasing ability to tackle large datasets. Important challenges include the need for standardization of data content, a heightened need for collaborative networks with sharing of both data and expertise, and a need to reconsider how and when analytic methodology is taught to medical researchers. We also set “Big data” analytics in context: recent advances may appear to promise a revolution, sweeping away conventional approaches to medical science, but their real promise lies in their synergy with, rather than replacement of, classical hypothesis-driven research. Data-driven hypotheses based on interpretable models will always require stringent validation and experimental testing, and hypothesis-generating research founded on large datasets adds to, rather than supplants, conventional hypothesis-driven science. Each can benefit from the other, and it is through using both that we can improve clinical practice.
Big data analyses can be used to ask novel questions, with conventional experimental techniques remaining just as relevant for testing them.
The synergistic cycle of hypothesis-driven and data-driven experimentation
The development of Big data approaches has greatly enhanced our ability to probe which “parts” of biology may be dysfunctional. The goal of precision medicine aims to take this approach one step further, by making that information of pragmatic value to the practicing clinician. Precision medicine can be succinctly defined as an approach to provide the right treatments to the right patients at the right time (7)
The challenge of reducing biology to its component parts, then identifying which can and should be measured to choose an optimal intervention, the patient population that will benefit, and when they will benefit most, cannot be overstated. Big data approaches promise to help us reach this aspirational goal.
In this review we summarize a number of the key challenges in using Big data analysis to facilitate precision medicine. Technical and methodological approaches have been systemically discussed elsewhere and we direct the reader to these excellent reviews (8–10)
Here we identify key conceptual and infrastructural challenges and provide a perspective on how advances can be and are being used to arrive at precision medicine strategies with specific examples
The concept of Big data in medicine is not difficult to grasp: use large volumes of medical information to look for trends or associations that are not otherwise evident in smaller data sets. So why has Big data not been more widely leveraged in medicine? What is the difference between healthcare and industries such as Google, Netflix, and Amazon, which have harnessed Big data to provide accurate and personalized real-time information from online searching and purchasing activities? Analysis of these successful industries reveals they have free and open access to data, which are provided willingly by the customer and delivered directly and centrally to the company. These deep data indicate personal likes and dislikes, enabling accurate predictions for future on-line interactions. Is it possible that large-volume medical information from individual patient data could similarly be used to identify novel risks or therapeutic options that can then be applied at the individual level to improve outcomes? In contrast, medical data, representing deep private personal information, is carefully guarded and not openly available; data are usually siloed in clinic or hospital charts with no central sharing to allow the velocity or volume of data required to exploit Big data methods.
Medical data is also complex and less “usable” compared with that being provided to large companies, and therefore requires processing to provide a readily usable form. The technical infrastructure even to allow the movement, manipulation, and management of medical data is not readily available. Thus major barriers exist in the access to data, barriers which are both philosophical and practical. To improve the translation of existing data into new healthcare solutions, several elements are required: the collection and standardization of heterogeneous datasets, prior informed consent for use of de-identified data, and the ability to provide these data back to the healthcare and research communities for further use.
In the commercial world, this real-time information is primarily used to predict future trends (predictive modeling) without trying to provide any reasons for the findings. A more immediate target for Big data is the wealth of clinical data already housed in hospitals, which could help answer the question of why particular events are occurring if they could be integrated and analyzed appropriately, giving insights into the causes of disease and supporting the development of future drugs and interventions. Assimilating these data will require massive computing far beyond an individual’s capability, thus fulfilling the definition of Big data. The data will largely be derived from and specific to populations and then applied to individuals (e.g., patient groups with different disease types or processes provide new insights for the benefit of individuals), and will be retrospectively collected rather than prospectively acquired. Furthermore, while non-medical Big data has largely been incidental, collected at no charge and with low information density, the Big data of the clinical world will be acquired intentionally, costly (to someone), and with high information density. This is therefore more akin to business intelligence, which requires Big data techniques to derive measurements and to detect trends (not just predict them) which are otherwise not visible or manageable by human inspection alone.
The clinical domain has a number of similarities with the business intelligence model of Big data, which potentially provides an approach to clinical data handling already tested in the business world. In both domains, data are both structured and non-structured, and technologies to deal with both will be required to allow easy interpretation. In business, this allows the identification of new opportunities and the development of new strategies, which can be translated clinically as new understanding of disease and development of new treatments. Big data provides the ability to combine data from numerous sources both internal and external to a business; similarly, data from different clinical sources (imaging, pathology, laboratory results, the electronic health record, etc.) may be combined in the clinical domain and provide “intelligence” not derived from any single data source. A central data warehouse provides a site for integrating this varied data, allowing curation. Currently such centralized repositories do not commonly exist in clinical information technology infrastructure within hospitals. These data repositories have been designed and built in the pre-Big data era, being standalone and siloed, with no intention of allowing the data to be combined and then analyzed in conjunction with various data sets. There is a need for newly installed information technology systems within clinical domains to ensure there is a means to share data between systems.
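As an illustration (not drawn from the original article), the following minimal Python sketch shows the kind of integration a central data warehouse enables: combining siloed clinical sources on a shared patient identifier. The table names, columns, and values are hypothetical assumptions for the example only.

```python
# A minimal sketch of warehouse-style integration across siloed clinical sources.
# All tables, columns, and values are illustrative assumptions.
import pandas as pd

ehr = pd.DataFrame({
    "patient_id": [101, 102, 103],
    "diagnosis":  ["heart failure", "type 2 diabetes", "asthma"],
})
imaging = pd.DataFrame({
    "patient_id": [101, 103],
    "ejection_fraction": [0.35, 0.60],
})
labs = pd.DataFrame({
    "patient_id": [101, 102],
    "hba1c": [6.1, 8.4],
})

# The "warehouse" view: one row per patient, drawing on every source at once,
# which no single siloed system provides on its own.
warehouse = (ehr
             .merge(imaging, on="patient_id", how="left")
             .merge(labs, on="patient_id", how="left"))
print(warehouse)
```

In practice the same join logic would run against curated extracts from each clinical system rather than in-memory tables, but the principle — a shared identifier and a combined view — is the same.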
Under the GDPR, processing of “data concerning health,” “genetic data,” and “biometric data” is prohibited unless one of several conditions applies: the data subject gives “explicit consent,” processing is necessary for the purposes of provision of services or management of the health system (etc.), or processing is necessary for reasons of public interest in the area of public health. Within the GDPR, the data subject also has the “right to be forgotten”: he/she can withdraw the consent, after which the data controller needs to remove all his/her personal data (18). Because of all the issues around data sharing, scientists might consider (whenever possible) sharing only aggregated data which cannot be traced back to individual data subjects, or raising the abstraction level, sharing insights instead of data (19).
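To make the idea of sharing aggregated rather than individual-level data concrete, here is a minimal Python sketch (not from the original article): group-level summaries are released only when the group is large enough that no row can be traced back to a single data subject. The column names, threshold, and values are illustrative assumptions.

```python
# A minimal sketch of releasing aggregated summaries instead of row-level data.
# Columns, values, and the minimum group size are illustrative assumptions.
import pandas as pd

records = pd.DataFrame({
    "age_band":  ["40-49", "40-49", "50-59", "50-59", "50-59", "60-69"],
    "diagnosis": ["T2D",   "T2D",   "T2D",   "T2D",   "T2D",   "T2D"],
    "hba1c":     [7.1,      8.0,     6.9,     7.5,     8.2,     9.0],
})

MIN_GROUP_SIZE = 3  # suppress groups too small to be released safely

summary = (records.groupby(["age_band", "diagnosis"])["hba1c"]
           .agg(n="count", mean_hba1c="mean")
           .reset_index())
releasable = summary[summary["n"] >= MIN_GROUP_SIZE]
print(releasable)  # only the 50-59 group (n=3) is released
```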
While the implementation of GDPR has brought this issue into sharp focus, we face an increasingly urgent need to balance the opportunity Big data provides for improving healthcare against the right of individuals to control their own data. It is our responsibility to only use data with appropriate consent, but it is also our responsibility to maximize our ability to improve health. Balancing these two will remain an increasing challenge for all precision medicine strategies.
FAIR principles for data management and stewardship
Healthcare institutions are generally well equipped with information technology, but this has been designed to support the clinical environment and billing, not the research environment of Big data. Exploitation of this new research environment will require a dedicated environment to store, manipulate, and manage these data. Clinical systems are built to isolate different data sets, such as imaging, pathology, and laboratory results, whereas the Big data domain requires the integration of data. The EHR may provide some of this cross-referencing of unstructured data, but it does not give the opportunity for deriving more complex data from datasets such as imaging and pathology, which would allow further analysis beyond the written report. In contrast, a data warehouse provides a “third space” for housing diverse data that normally resides elsewhere. This allows the handling of multiple individuals at the same time, grouped by some common feature such as disease type or imaging appearance, which is the opposite of a clinical system, which is usually interested in varied data from one patient at a time. A data warehouse also allows secondary handling to generate cleaner, more information-rich data, as seen when applying annotations and segmentation in pathological and radiological images. Finally, the data warehouse needs to provide the interface with multiple software applications, delivering high-volume data that can then undergo various pre-processing techniques in readiness for the application of Big data techniques, including artificial intelligence and machine learning.
The latter need specialized high-powered computing to achieve rapid processing. Graphics processing units (GPUs) allow the handling of very large batches of data and undertake repetitive operations that can accelerate processing by up to a hundred times compared to standard central processing units (CPUs). However, current data handling systems are not yet equipped with these processors, requiring upgrading of hardware infrastructure to translate these new techniques into the clinical domain. The connection of these supercomputing stacks to the data can potentially be achieved via the central data warehouse containing the pre-processed data drawn from many sources.
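As an illustration of the CPU-versus-GPU speedup described above (not from the original article), the sketch below times a batched, repetitive matrix operation on both devices. PyTorch is an assumed, illustrative framework here; the matrix size and repeat count are arbitrary.

```python
# A minimal sketch comparing CPU and GPU throughput on a repetitive batched
# operation. Framework (PyTorch), sizes, and repeat counts are assumptions.
import time
import torch

def time_matmul(device: str, n: int = 2048, repeats: int = 10) -> float:
    """Return the average time of a large matrix multiplication on `device`."""
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    torch.matmul(a, b)                 # warm-up so setup costs are excluded
    if device == "cuda":
        torch.cuda.synchronize()
    start = time.perf_counter()
    for _ in range(repeats):
        torch.matmul(a, b)
    if device == "cuda":
        torch.cuda.synchronize()
    return (time.perf_counter() - start) / repeats

cpu_time = time_matmul("cpu")
print(f"CPU: {cpu_time:.4f} s per multiply")
if torch.cuda.is_available():
    gpu_time = time_matmul("cuda")
    print(f"GPU: {gpu_time:.4f} s per multiply ({cpu_time / gpu_time:.0f}x faster)")
else:
    print("No GPU available on this machine; this batched, repetitive workload "
          "is the kind that GPUs accelerate.")
```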
A significant barrier to the application of new Big data techniques in clinical practice is the positioning of these techniques in the current clinical work environment. New and disruptive technologies provided by Big data analytics are likely to be just that: disruptive. Current clinical practice will need to change to incorporate these new data-driven techniques, and there may need to be sufficient periods of testing of new techniques, especially those which in some way replace a human action and speed the clinical process. Those that aid this process by prioritizing worklists or flagging urgent findings are likely to diffuse rapidly into day-to-day usage, and techniques not previously possible because of the sheer size of data being handled are likely to gain early adoption. A major player in achieving this process will be industry, which will enable the incorporation of hardware and software to support Big data handling in the current workflow. If access to data and its analysis is difficult and requires interruption of the normal clinical process, uptake will be limited. A push-button application on a computer screen, however, such as one on an X-ray image viewer that seamlessly activates the program in the background, is far more likely to be adopted.
Greater success will be achieved with the automatic running of programs triggered by image content and specific imaging examinations. These programs could potentially identify suspicious regions in complex images requiring further scrutiny, or undertake quantitative measurements in the background which are then presented through visualization techniques in conjunction with the qualitative structured report. Quantitative data can then be compared with those from large populations to define their position in the normal range and, potentially, provide predictive data regarding drug response. Part of the attraction for industry in this rapidly expanding arena will obviously be the generation of Intellectual Property (IP). Development of new techniques useful to clinical departments will require close collaboration between industry and clinical researchers to ensure developments are relevant and rigorously tested in real-life scenarios. Although such changes within the operation of the IP system give new possibilities to researchers, it is hard to forecast the long-term effect, and whether and how incentives in health research will be shifted (encouraging or discouraging innovation).
Beyond these infrastructural and commercial issues, many challenges remain in collecting, sharing, and analysing data such that they can prove useful for precision medicine.
While new technologies have greatly increased our ability to generate data, old problems persist and are arguably accentuated by hypothesis-generating methods. A fundamental scientific tenet holds that, for a result to be robust, it must be reproducible. However, it has been reported that results from a concerning proportion of even the highest-ranking scientific articles may not be reproduced [~11% only were (30)]. Granted, this was a restricted set of highly novel preclinical findings, and the experimental methods being used were in some cases advanced enough to require a not-always-accessible mastery/sophistication for proper execution. Nonetheless, the likelihood that such independent validation will even be attempted falls as the time, energy, and cost of generating data increases. Big data is often expensive data, but we should not allow a shift toward hypothesis-free experimentation to erode confidence in the conclusions being made. The best—arguably the only—way to improve confidence in the findings from Big data is to work to facilitate transparent validation of findings.
The concept of overfitting and model regularization
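To make the idea behind this figure concrete, here is a minimal Python sketch (not from the original article): a very flexible model fitted to few noisy points typically achieves a low training error but a high test error, while an L2-penalised (ridge) fit generalises better. The synthetic data, polynomial degree, and penalty strength are illustrative assumptions.

```python
# A minimal sketch of overfitting versus regularization on synthetic data.
# The data-generating process, degree, and alpha are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.metrics import mean_squared_error
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
x_train = rng.uniform(0, 1, 20)[:, None]                       # small, noisy cohort
y_train = np.sin(2 * np.pi * x_train).ravel() + rng.normal(0, 0.2, 20)
x_test = rng.uniform(0, 1, 200)[:, None]                       # held-out data
y_test = np.sin(2 * np.pi * x_test).ravel() + rng.normal(0, 0.2, 200)

for name, model in [("unregularized", LinearRegression()),
                    ("ridge (alpha=0.01)", Ridge(alpha=0.01))]:
    pipe = make_pipeline(PolynomialFeatures(degree=15), model)
    pipe.fit(x_train, y_train)
    print(name,
          "| train MSE:", round(mean_squared_error(y_train, pipe.predict(x_train)), 3),
          "| test MSE:", round(mean_squared_error(y_test, pipe.predict(x_test)), 3))
```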
As we accumulate genomics data of several kinds across a spectrum of diseases, estimating effect sizes becomes more accurate and more accessible. In turn, realistic effect sizes enable a priori power calculations and/or simulations, which help reveal whether a study is sufficiently sensitive or rather doomed by either false negatives or spurious correlations.
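The following minimal Python sketch (not from the original article) shows what such a simulation-based power calculation can look like: given an assumed true effect size and a candidate sample size, it estimates the probability of reaching significance. The effect size, noise model, and alpha are illustrative assumptions.

```python
# A minimal sketch of an a priori, simulation-based power calculation.
# Effect size, noise model, sample sizes, and alpha are illustrative assumptions.
import numpy as np
from scipy import stats

def simulated_power(effect_size: float, n_per_group: int,
                    n_sim: int = 2000, alpha: float = 0.05) -> float:
    """Fraction of simulated two-group comparisons reaching p < alpha."""
    rng = np.random.default_rng(42)
    hits = 0
    for _ in range(n_sim):
        control = rng.normal(0.0, 1.0, n_per_group)
        treated = rng.normal(effect_size, 1.0, n_per_group)  # assumed true effect
        _, p = stats.ttest_ind(control, treated)
        if p < alpha:
            hits += 1
    return hits / n_sim

for n in (20, 50, 100):
    print(f"n={n} per group, assumed effect 0.5 SD: power ≈ {simulated_power(0.5, n):.2f}")
```

A study whose simulated power is low under realistic effect sizes is, in the terms used above, likely doomed to false negatives before it starts.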
In addition, changes in the way the biomedical community works, with increasing collaboration and communication, are also facilitating validation of models built on Big data. Large, publicly shared datasets are often referred to by a constellation of published studies having no formal connection with the primary initiative. This last observation has important ramifications for the integration of biomedical and analytical training (see below). Arguably the most remarkable success in the field of Big data receives almost no attention. Despite significant potential for the creation of protectable value, software developers have almost universally made source code freely available through open-source tools that are frequently adapted and improved by the scientific community (42). Encouraged by organizations such as the Open Bioinformatics Foundation and the International Society for Computational Biology, this quiet revolution has undoubtedly had a huge impact on the rate of progress and our ability to harness the potential of Big data, and continues to do so.
For validation to succeed, it is necessary to ensure that measurements made in a training cohort are comparable to those made in a test set. Robust analysis can fail to validate just as easily as overfitted analysis, particularly where patients may come from different healthcare systems, samples may be collected in different ways, and data may be generated using different protocols.
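One simple, standard safeguard is to learn any scaling or normalisation parameters on the training cohort only and apply them unchanged to the test cohort, then check whether the test cohort still looks comparable. The sketch below (not from the original article) illustrates this with synthetic data; the cohort sizes and the simulated protocol shift are assumptions.

```python
# A minimal sketch of keeping a test cohort comparable to a training cohort:
# scaling is fitted on the training data only, then applied to the test data.
# Cohort sizes and the simulated protocol shift are illustrative assumptions.
import numpy as np
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
train_expr = rng.normal(10.0, 2.0, size=(100, 5))   # hypothetical training cohort
test_expr = rng.normal(12.0, 2.5, size=(40, 5))     # test cohort, shifted protocol

scaler = StandardScaler().fit(train_expr)           # fit on the training cohort only
test_scaled = scaler.transform(test_expr)

# A large shift in the test set's mean (in training-set units) flags a potential
# batch or protocol difference to investigate before validation is attempted.
print("test-set mean per feature:", np.round(test_scaled.mean(axis=0), 2))
```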
This applies to blood and tissue samples, as well as stool samples for the newly emerging field of microbiome research. Basic experimental methodologies involved in sample collection or generation are crucial for the quality of genomics datasets. Twenty-first-century omics-generating platforms are often perceived to be so advanced and complex that they should draw most of the planning effort, leaving details of trivial twentieth-century steps (cell culture, nucleic acid isolation) comparatively overlooked. However, while experiments performed on poorly generated material would yield only a handful of flawed data points in the pre-genomics era, they can bias thousands of data points at once in the Big data era. When thinking about the reasons underlying failed or low-quality omics experiments in daily practice, it is easy to realize that trivialities and logistics play a major role, while sequencing and proteomics platforms are seldom the culprit. Standardization of samples also allows data generated from one individual or cohort to be used in other related studies and obviates the need to generate the same data over and over again. While this has to be within the remit of ethics and data-sharing regulations for each institution and country, it allows for better use of limited resources (such as clinical material) and funds. Establishing clear principles on data access and sharing is a key step in establishing and maintaining community-wide access to the kind of collaborative sample sharing required to facilitate both discovery and validation.
Population-based studies tend to be disease-specific, whereas the EHR captures whatever conditions bring patients into contact with the healthcare system. Thus the EHR provides opportunities to study virtually any disease, as well as pleiotropic influences of risk factors such as genetic variation. Since the EHR was not originally designed for evidence generation, leveraging these data is fraught with challenges related to how the data are collected and recorded. While opportunities exist to study a spectrum of phenotypes, data contained in the EHR are generally not as rigorous or complete as those collected in a cohort-based study. Nevertheless, these EHRs offer potential solutions to problems involving Big data, including the reliability and standardization of data and the accuracy of EHR phenotyping, and there are multiple examples of how these challenges are being met by researchers and clinicians across the globe. Standardization across multiple countries and EHR software tools provides a vast opportunity for scalability.
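To illustrate what an EHR phenotyping algorithm can look like in its simplest rule-based form, here is a minimal Python sketch (not from the original article): patients are flagged as probable cases only when structured diagnosis codes and prescriptions agree. The code set, drug set, and table layout are hypothetical assumptions, not a validated algorithm.

```python
# A minimal sketch of rule-based EHR phenotyping: require >= 2 relevant
# diagnosis codes plus a relevant prescription. Codes, drugs, and the table
# layout are hypothetical assumptions.
import pandas as pd

diagnoses = pd.DataFrame({
    "patient_id": [1, 1, 2, 3, 3, 3],
    "icd10":      ["E11.9", "E11.65", "I10", "E11.9", "E11.9", "I10"],
})
prescriptions = pd.DataFrame({
    "patient_id": [1, 2, 3],
    "drug":       ["metformin", "lisinopril", "metformin"],
})

T2D_CODES = {"E11.9", "E11.65"}   # hypothetical code set
T2D_DRUGS = {"metformin"}         # hypothetical drug set

code_counts = (diagnoses[diagnoses["icd10"].isin(T2D_CODES)]
               .groupby("patient_id").size())
has_codes = set(code_counts[code_counts >= 2].index)
has_drug = set(prescriptions[prescriptions["drug"].isin(T2D_DRUGS)]["patient_id"])

cases = sorted(has_codes & has_drug)
print("patients meeting the phenotype definition:", cases)  # -> [1, 3]
```

Real phenotyping algorithms are considerably richer (temporal logic, exclusions, free-text evidence) and, as noted above, need validation against chart review before reuse across sites.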
As the technical issues with EHR data are addressed, legal and ethical frameworks will be necessary to build and maintain public trust and ensure equitable data access. While illustrating the power of these biorepositories, it is worth noting that these results were not returnable to patients due to restrictions around the ethical approval of the biorepository.
While health improvements brought about by the application of Big data techniques are still emerging, the possible benefits can be seen in those clinical areas that already hold large volumes of digital data. One such area is clinical imaging, where data are invariably digitized and housed in dedicated picture archiving systems. This imaging data is connected with clinical data in the form of image reports and the electronic health record, and also carries its own metadata. Because of the ease of handling of this data, imaging has been among the first areas to show that artificial intelligence, through machine learning techniques, can exploit Big data to provide clinical benefit. The need for these high-powered computing techniques in part reflects the need to extract hidden information from images which is not readily available from the original datasets. This is in contrast to simple parametric data within the clinical record, including physiological readings such as pulse rate or blood pressure. The need for similar data processing is also seen in digitized pathology image specimens.
Big data can provide annotated data sets to be used to train artificial intelligence algorithms to recognize clinically relevant conditions or features. In order for the algorithm to learn the relevant features, significant numbers of cases with the feature or condition under scrutiny are required, while separate large volumes of cases are used to test the algorithm against gold-standard annotations. Once trained to an acceptable level, these techniques have the opportunity to provide pre-screening of images to look for cases with a high likelihood of disease, allowing prioritization of formal reading. Screening tests such as breast mammography could undergo pre-reading by artificial intelligence/machine learning to identify the few positive cases among the many normal studies, allowing rapid identification and turnaround. Pre-screening of complex, high-acuity cases, as seen in the trauma setting, also allows a focused approach to identify and review areas of concern as a priority.
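As a schematic illustration of such worklist prioritization (not from the original article), the sketch below trains a classifier on annotated cases, scores unread studies, and sorts the reading worklist by predicted risk. The "image features" here are synthetic stand-ins for features extracted from scans; the class balance and model choice are assumptions.

```python
# A minimal sketch of AI-assisted worklist prioritization on synthetic data.
# Features, prevalence, and the model choice are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical annotated cohort: rows are studies, columns are image-derived features.
X, y = make_classification(n_samples=2000, n_features=20,
                           weights=[0.95, 0.05], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Score the unread studies and sort the worklist so high-risk cases are read first.
risk = model.predict_proba(X_test)[:, 1]
worklist = np.argsort(risk)[::-1]
top = worklist[:20]
print(f"positives among the 20 highest-ranked studies: {y_test[top].sum()} "
      f"(overall prevalence in the test set: {y_test.mean():.1%})")
```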
Quantification of structures within an image, such as tumor volume or the cardiac function measures used to manage drug therapy of heart failure or following heart attack, can be incorporated into artificial intelligence/machine learning algorithms so that such measurements are undertaken automatically rather than requiring painstaking manual segmentation of structures.
As artificial intelligence/machine learning continues to improve, it has the ability to recognize image features without any pre-training through the application of neural networks which can assimilate different sets of clinical data. The resultant algorithms can then be applied to similar, new clinical information to predict individual patient responses based on large prior patient cohorts. Similar techniques can be applied to images to identify subpopulations that are otherwise too complex to recognize. In this way, artificial intelligence/machine learning may find a role in hypothesis generation by identifying unrecognized, unique image features or combinations of features that relate to disease progression or outcome. For example, a subset of patients with memory loss who potentially progress to dementia may have features detectable prior to symptom development. This approach allows large-volume population interrogation with prospective clinical follow-up and identification of the most clinically relevant image fingerprints, rather than analysis of small-volume retrospective data in patients who have already developed symptomatic degenerative brain disease.
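As a simple stand-in for this kind of unsupervised subgroup discovery (not from the original article, and much simpler than a neural network), the sketch below clusters image-derived features to surface candidate patient subpopulations for hypothesis generation. The feature matrix, number of clusters, and algorithm choice are illustrative assumptions.

```python
# A minimal sketch of unsupervised subgroup discovery on synthetic image features.
# The feature matrix, cluster count, and algorithm are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.preprocessing import StandardScaler

# Hypothetical feature matrix: one row per patient, columns are image features.
X, _ = make_blobs(n_samples=500, centers=3, n_features=8, random_state=0)
X = StandardScaler().fit_transform(X)

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print("candidate subgroup sizes:", np.bincount(kmeans.labels_))
# Each candidate subgroup would then be examined against prospective clinical
# follow-up to see whether it tracks with progression or outcome, as described above.
```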
Despite the vast wealth of data contained in the clinical information technology systems within hospitals, extraction of usable data from the clinical domain is not a trivial task. This is for a number of diverse reasons, including the philosophy of data handling, the physical data handling infrastructure, the data format, and the translation of new advances into the clinical domain. These problems must be addressed prior to successful application of these new methodologies. In addition, analysts should be trained not only in informatics but in biomedicine also. While these principles represent a laudable goal, it remains to be seen if and how they might be realized. Increasingly, computational biologists are promoted for contributing to team science as middle authors while producing original work around developing novel approaches to data analysis; we would therefore propose adding a seventh principle here: “Developing novel approaches to data analysis.”
Key proposed principles when assessing scientists
In recent years the field of biomedical research has seen an explosion in the volume, velocity, and variety of information available, something that has collectively become known as “Big data.” This hypothesis-generating approach to science is arguably best considered not as a simple expansion of what has always been done, but rather as a complementary means of identifying and inferring meaning from patterns in data. An increasing range of “machine learning” methods allow these patterns or trends to be learned directly from the data itself, rather than pre-specified by researchers relying on prior knowledge. These advances are cause for great optimism: they are less reliant on prior knowledge and hence can facilitate advances in our understanding of biological mechanism through a “systems medicine” approach. They can also identify patterns in biomedical data that can inform development of clinical biomarkers or indicate unsuspected treatment targets.
However, in order to fully realize the potential inherent in the Big data we can now generate, forming collaborative networks—sharing samples, data, and methods—is now more important than ever, and increasingly requires building bridges to less traditional collaborating specialities such as engineering. Such increased interaction is unavoidable if we are to ensure that mechanistic inferences drawn from Big data are robust and reproducible. Infrastructure capacity will require constant updating, while regulation and stewardship must reassure the patients from whom data are sourced that their information is handled responsibly; this must be achieved without introducing stringency measures that threaten the access that is necessary for progress to flourish.
Finally, it is clear that the rapid growth in information is going to continue: Big data is going to keep getting Bigger, and the way we teach biomedical science must adapt too. Nevertheless, there is clear evidence that each of these challenges can be and is being met in at least some areas. Making the most of Big data will be no mean feat, but the potential benefits are Bigger still.
All authors wrote sections of the manuscript
TH and EM put the sections together and finalized the manuscript
SJ is affiliated with a genomic data platform company dealing with big data.
OV acknowledges the financial support of the János Bolyai Research Fellowship, Hungarian Academy of Sciences (Bolyai + ÚNKP-18-4-DE-71) and TÉT_16-1-2016-0093.
RS is currently an employee of Synthetic Genomics Inc.; this work only reflects his personal opinions, and he declares no competing financial interest.
DH receives research funding or consulting fees from Bristol Myers Squibb. EM is co-founder and CSO of PredictImmune Ltd., and receives consulting fees from Kymab Ltd.
The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest
The concept of integrative levels and biology
Next-generation machine learning for biological networks
Disease heritability studies harness the healthcare system to achieve massive scale
Detecting influenza epidemics using search engine query data
European Parliament and Council of the European Union. Regulation on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (Data Protection Directive).
14. Intersoft Consulting. Recital 26 - Not Applicable to Anonymous Data, (2018). Available online at: https://gdpr-info.eu/recitals/no-26/
15. Data Protection Network. GDPR and Data Processing Agreements, (2018). Available online at: https://www.dpnetwork.org.uk/gdpr-data-processing-agreements/
16. Intersoft Consulting. Art. 7 GDPR - Conditions for consent, (2018). Available online at: https://gdpr-info.eu/art-7-gdpr/
17. Smith DW. GDPR Runs Risk of Stifling Healthcare Innovation. (2018). Available online at: https://eureka.eu.com/gdpr/gdpr-healthcare/
18. Intersoft Consulting. Art. 17 GDPR - Right to Erasure (‘Right to be Forgotten'). (2018). Available online at: https://gdpr-info.eu/art-17-gdpr/
19. Staten J. GDPR and The End of Reckless Data Sharing, (2018). Available online at: https://www.securityroundtable.org/gdpr-end-reckless-data-sharing/
Ontologies for clinical and translational research: Introduction
Spatially explicit data: stewardship and ethical challenges in science
The FAIR guiding principles for scientific data management and stewardship
patents and translational research in genomics: Issues concerning gene patents may be impeding the translation of laboratory research to clinical use
24. Oldham P., Kitsara I. WIPO Manual on Open Source Tools for Patent Analytics. (2016). Available online at: https://wipo-analytics.github.io/
Enhancing patent landscape analysis with visualization output. World Patent Information (2010) 32:203–20.
Methods of integrating data to uncover genotype-phenotype interactions
G&T-seq: parallel sequencing of single-cell genomes and transcriptomes
Parallel single-cell sequencing links transcriptional and epigenetic heterogeneity
Drug development: raise standards for preclinical cancer research
Machine learning applications in genetics and genomics
Sequencing technology does not eliminate biological variability
TEDDY–the environmental determinants of diabetes in the young: an observational clinical trial
Type 1 Diabetes TrialNet–an international collaborative clinical trials network
Ten years of the immune tolerance network: an integrated clinical research organization
Type 1 diabetes trialnet: a multifaceted approach to bringing disease-modifying therapy to clinical use in type 1 diabetes
Cell-of-origin patterns dominate the molecular classification of 10,000 tumors from 33 types of cancer
Fast-TRKing drug development for rare molecular targets. Cancer Discov (2017) 7:934–6.
Alternative models for sharing confidential biomedical data
Predicting human olfactory perception from chemical features of odor molecules
Open-source software accelerates bioinformatics
An epidemiological perspective of personalized medicine: the estonian experience
The next generation of precision medicine: observational studies
Comparison of manual and automated nucleic acid extraction from whole-blood samples
Next-generation sequencing of RNA and DNA isolated from paired fresh-frozen and formalin-fixed paraffin-embedded samples of human cancer and normal tissue
Performance comparison of three DNA extraction kits on human whole-exome data from formalin-fixed paraffin-embedded normal and tumor samples
Towards standards for human fecal sample processing in metagenomic studies
SaVanT: a web-based tool for the sample-level visualization of molecular signatures in gene expression profiles
Cell-type deconvolution from DNA methylation: a review of recent applications
Analysis of protein-coding genetic variation in 60,706 humans
Global Alliance for Genomics and Health GENOMICS
Systematic reanalysis of clinical exome data yields additional diagnoses: implications for providers
Making new genetic diagnoses with old data: iterative reanalysis and reporting from genome-wide data in 1,133 families with developmental disorders
The work of the human proteome organisation's proteomics standards initiative (HUPO PSI)
Standards and guidelines for the interpretation of sequence variants: a joint consensus recommendation of the american college of medical genetics and genomics and the association for molecular pathology
Evaluating the clinical validity of gene-disease associations: an evidence-based framework developed by the clinical genome resource
Data harmonization for a molecularly driven health system
The monarch initiative: an integrative data and analytic platform connecting phenotypes to genotypes across species
Big data and machine learning in health care
Big data from electronic health records for early and late translational cardiovascular research: challenges and potential
Mechanistic phenotypes: an aggregative phenotyping strategy to identify disease mechanisms using GWAS data
Validation of electronic medical record-based phenotyping algorithms: results and lessons learned from the eMERGE network
The IGNITE network: a model for genomic medicine implementation and research
Using systems approaches to address challenges for clinical implementation of pharmacogenomics
The electronic medical records and genomics (eMERGE) network: past
Phenome-wide association studies as a tool to advance precision medicine
Phenome-wide scanning identifies multiple diseases and disease severity phenotypes associated with HLA variants
Phenotype risk scores identify patients with unrecognized mendelian disease patterns
Swift action needed to close the skills gap in bioinformatics
75. Coursera Big Data Integration and Processing (2018). Available online at: https://www.coursera.org/learn/big-data-integration-processing
76. i2b2 tranSMART Foundation. The i2b2 tranSMART Foundation 2019 Training Program (2019). Available online at: https://transmartfoundation.org/the-i2b2-transmart-foundation-training-program/
77. ELIXIR. ELIXIR Workshops and Courses, (2018). Available online at: https://www.elixir-europe.org/events?f[0]=field_type:15
78. European Bioinformatics Institute. Introduction to Multiomics Data Integration (2019). Available online at: https://www.ebi.ac.uk/training/events/2019/introduction-multiomics-data-integration-1
Combined measurement of the Higgs boson mass in pp collisions at √s = 7 and 8 TeV with the ATLAS and CMS experiments.
Citation: … Hafler DA and McKinney EF (2019) From Big Data to Precision Medicine. Front. Med. 6:34. doi: 10.3389/fmed.2019.00034
Received: 14 July 2018; Accepted: 04 February 2019; Published: 01 March 2019
Copyright © 2019 Hulsen, Jamuar, Moody, Karnes, Varga, Hedensted, Spreafico, Hafler and McKinney. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Tim Hulsen, tim.hulsen@philips.com; Eoin F. McKinney, efm30@medschl.cam.ac.uk
Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations
Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher
The 2025 IndyCar Series season is underway, and Christian Lundgaard is in his first season with Arrow McLaren after three with Rahal Letterman Lanigan
Here's what you should know about Christian Lundgaard:
St. Petersburg: 8th
Thermal: 3rd after starting 2nd
Long Beach: 3rd after starting 12th
Barber: 2nd after starting 7th
How can I listen to or stream IndyCar races in 2025? IndyCar Nation is on SiriusXM Channel 218, IndyCar Live and the IndyCar Radio Network (check affiliates for each race).
The 2025 IndyCar Series schedule includes 17 races
May 25, Indianapolis 500 *, 12:45 p.m.
June 22, Elkhart Lake, Wisconsin &, 2:30 p.m.
(Team and drivers; *-Indianapolis 500 only)
NutriScan has been acquired by the production and distribution company R2 Group in Hedensted. NutriScan will be part of R2 Group’s strategic business unit Agro, where it fits well alongside the other business units of SBU Agro, including R2 Feed Partner. He emphasises that both R2 Feed Partner and NutriScan will continue to sell via distributors and not directly to farmers, and that new solutions will obviously always be developed for the benefit of the feed industry. In 2011 R2 Group chose to sell 50% of its shares to Maj Invest, securing the financial foundation to continue the aggressive expansion plan which has led to the acquisition of NutriScan among other companies. R2 Group also sells raw materials and ingredients to other industries such as the food industry, the chemico-technical industry and the pharmaceutical industry. With the acquisition, R2 Group will have nearly 100 employees in Denmark.
Here's what you need to know about Christian Lundgaard:
Pole day report: McLaughlin leads Team Penske front row sweep for 2024 Indy 500
The 108th running of the Indianapolis 500 takes place Sunday
These drivers are in the race for the first time:
Chris Sims is a digital content producer at Midwest Connect Gannett. Follow him on Twitter: @ChrisFSims.
Slower-growing broilers are on people’s minds, but Danish broiler farmer Rune Thomsen sees no immediate need to switch from fast to slower growth: “There is no current demand to cater for.” The grow-out houses on Danish broiler farms have the same setup and design as almost everywhere else. That is immediately clear when entering the broiler houses of the Williamsborg farm in Hedensted, with combined top and longitudinal ventilation.
A kangaroo is roaming around Denmark, and the country's police are appealing for help to catch the animal. Some locals said the kangaroo might have been staying in farmland for about eight years. Danish authorities were baffled when the kangaroo was observed hopping through a farm in the south-east of the country, as per Daily Mail. The kangaroo hopped loose next to a road on Monday morning, and local police stated they have no idea where the animal came from.
Police released video of the kangaroo bounding through a field in the tiny Danish village of Oster Ulslev on social media. No one has reported the kangaroo as missing, and the local zoos claim it doesn't belong to them.
After the footage spread, speculation grew about the Australian kangaroo seen hopping through a field in the small Danish village. According to locals, it was the same marsupial that has eluded officials for years, while some said it was a local pet. One resident claimed to have seen the same kangaroo three kilometers away from Oster Ulslev the previous Thursday, and another said they had one bouncing around in the Hedensted neighborhood for almost a year and it still hasn't been caught. "There was a kangaroo loose last summer," Claudia Ringmark said on Facebook. Another local said it may have escaped from Vordingborg on Masned island. One man, Mr Kroman, claims that the same kangaroo has been on the run for the past eight years: "We have been trying to catch it without luck. We moved to Oster Ulslev in 2017 when it was already on the run from the owner, so it has probably been on the loose since 2014." Mr Kroman said he lunged at that same kangaroo when trying to catch it, but that getting near the animal is difficult, let alone catching it, and that he nearly caught it in 2019. He added that he was just about two metres away from the animal when it jumped.
On Monday, police in southeastern Denmark asked for the public's assistance in tracking down a kangaroo that was spotted hopping across a field, as per New York Post.
Any sightings or information about the animal's whereabouts should be reported to the non-emergency number 114, according to the South Zealand and Lolland-Falster Police. The animal isn't seen as harmful.
Despite the fact that kangaroos are uncommon in northern Europe, this is the second time the same police district has asked for assistance in locating one. A kangaroo escaped from a private animal farm in the same location in 2014.
In July 2018, a kangaroo went missing for half a day in Denmark before being located by its owner.
For more news, updates about kangaroos and similar topics don't forget to follow Nature World News!
Christian Lundgaard. Home town: Hedensted, Denmark.
Christian kicked off his professional karting journey at 11 years old and showed his prowess from a young age. His career quickly took off as he won multiple major championships throughout European karting competition. Christian made the jump to single-seater racing in 2017 when he debuted in the Spanish F4 Championship and the SMP F4 Championship at 15 years old. He then competed in the GP3 Series Championship in 2018, where he finished second in the standings. Christian emerged as an Alpine development driver competing in the 2019 FIA Formula 3 Championship. He then made the jump to the FIA Formula 2 Championship under Alpine support, where he registered two wins and nine podiums across the 2020-21 seasons. He next shifted gears to the North American motorsports scene, where he made an immediate splash as the 2022 NTT INDYCAR SERIES Rookie of the Year with Rahal Letterman Lanigan, with which he raced through his first three seasons in the series. Christian has produced a strong start to his campaign in the NTT INDYCAR SERIES. He remains the only Danish driver in INDYCAR SERIES history to boast a race win, after a dominant performance on the streets of Toronto in 2023, and he joins Arrow McLaren for the 2025 season and beyond.