A Deep Dissertation of Data Science: Related Issues and Its Applications




IEEE Talks Big Data - Check out our new Q&A article series with Big Data experts!

Call for Papers - Check out the many opportunities to submit your own paper. This is a great way to get published, and to share your research in a leading IEEE magazine!

Publications - See the list of various IEEE publications related to big data and analytics here.

Call for Blog Writers!

IEEE Cloud Computing Community is a key platform for researchers, academicians and industry practitioners to share and exchange ideas regarding cloud computing technologies and services, as well as identify the emerging trends and research topics that are defining the future direction of cloud computing. Come be part of this revolution as we invite blog posts on these topics, including but not limited to the list provided below:

  • Cloud Deployment Frameworks
  • Cloud Architecture
  • Cloud Native Design Patterns
  • Testing Services and Frameworks
  • Storage Architectures
  • Big Data and Analytics
  • Internet of Things
  • Virtualization techniques
  • Legacy Modernization
  • Security and Compliance
  • Pricing Methodologies
  • Service Oriented Architecture
  • Microservices
  • Container Technology
  • Cloud Computing Impact and Trends shaping today’s business
  • High availability and reliability

Call for Papers

No call for papers at this time.

IEEE Publications on Big Data


Read more at IEEE Computer Society.



IEEE Computer Magazine Special Issue on Big Data Management

  • Big Data: Promises and Problems


Connecting the Dots With Big Data

  • Better Health Care Through Data
  • The Future of Crime Prevention
  • Census and Sensibility
  • Landing a Job in Big Data

Read more at The Institute.

Download full issue. (PDF, 5 MB)

IEEE Internet Computing - July/August 2014


Web-Scale Datacenters

This issue of Internet Computing surveys issues surrounding Web-scale datacenters, particularly in the areas of cloud provisioning as well as networking optimization and configuration. Topics include workload isolation, recovery from transient server unavailability, network configuration, virtual networking, and content distribution.

Read more at IEEE Computer Society.

IEEE Network - July 2014

Networking for Big Data

Providing the most current information for communications professionals involved with the interconnection of computing systems, this bimonthly magazine covers all aspects of data and computer communications.

Read more at IEEE Communications Society.


Special Issue on Big Data

Big data is transforming our lives, but it is also placing an unprecedented burden on our compute infrastructure. As data expansion rates outpace Moore's law and supply voltage scaling grinds to a halt, the IT industry is being challenged in its ability to effectively store, process, and serve the growing volumes of data. Delivering on the promise of big data in the post-Dennard era calls for specialization and tight integration across the system stack, with the aim of maximizing energy efficiency, performance scalability, resilience, and security.


The Trusted Solution for Open Access Publishing


Fully Open Access Topical Journals

IEEE offers over 30 technically focused gold fully open access journals spanning a wide range of fields.


Hybrid Open Access Journals

IEEE offers 180+ hybrid journals that support open access, including many of the top-cited titles in the field. These titles have Transformative Status under Plan S.


IEEE Access

The multidisciplinary, gold fully open access journal of the IEEE, publishing high quality research across all of IEEE’s fields of interest.

About IEEE Open


Many authors in today’s publishing environment want to make access to research freely available to all reader communities. To help authors gain maximum exposure for their groundbreaking research, IEEE provides a variety of open access options to meet the needs of authors and institutions.

Call for Papers


Browse our fully open access topical journals and submit a paper.

News & Events

IEEE Announces 6 New Fully Open Access Journals and 3 Hybrid Journals Coming in 2024

IEEE Commits its Entire Hybrid Journal Portfolio to Transformative Journal Status Aligned with Plan S

IEEE and CRUI Sign Three-Year Transformative Agreement to Accelerate Open Access Publishing in Italy

New IEEE Open Access Journals Receive First Impact Factors


IEEE Access, a Multidisciplinary, Open Access Journal

IEEE Access is a multidisciplinary, online-only, gold fully open access journal, continuously presenting the results of original research or development across all IEEE fields of interest. Supported by article processing charges (APCs), its hallmarks are rapid peer review, a submission-to-publication time of 4 to 6 weeks, and articles that are freely available to all readers.


Now On-Demand

How to Publish Open Access with IEEE

This newly published on-demand webinar will provide authors with best practices in preparing a manuscript, navigating the journal submission process, and important tips to help an author get published. It will also review the opportunities authors and academic institutions have to enhance the visibility and impact of their research by publishing in the many open access options available from IEEE.

Register Now


IEEE Publications Dominate Latest Citation Rankings

Each year, the Journal Citation Reports® (JCR) from Web of Science Group examines the influence and impact of scholarly research journals. JCR reveals the relationship between citing and cited journals, offering a systematic, objective means to evaluate the world’s leading journals. The 2022 JCR study, released in June 2023, reveals that IEEE journals continue to maintain rankings at the top of their fields.

Data Science: Recently Published Documents


Assessing the effects of fuel energy consumption, foreign direct investment and GDP on CO2 emission: New data science evidence from Europe & Central Asia

Documentation Matters: Human-Centered AI System to Assist Data Science Code Documentation in Computational Notebooks

Computational notebooks allow data scientists to express their ideas through a combination of code and documentation. However, data scientists often pay attention only to the code, and neglect creating or updating their documentation during quick iterations. Inspired by human documentation practices learned from 80 highly-voted Kaggle notebooks, we design and implement Themisto, an automated documentation generation system to explore how human-centered AI systems can support human data scientists in the machine learning code documentation scenario. Themisto facilitates the creation of documentation via three approaches: a deep-learning-based approach to generate documentation for source code, a query-based approach to retrieve online API documentation for source code, and a user prompt approach to nudge users to write documentation. We evaluated Themisto in a within-subjects experiment with 24 data science practitioners, and found that automated documentation generation techniques reduced the time for writing documentation, reminded participants to document code they would have ignored, and improved participants’ satisfaction with their computational notebook.

Data science in the business environment: Insight management for an Executive MBA

Adventures in Financial Data Science

GeCoAgent: A Conversational Agent for Empowering Genomic Data Extraction and Analysis

With the availability of reliable and low-cost DNA sequencing, human genomics is relevant to a growing number of end-users, including biologists and clinicians. Typical interactions require applying comparative data analysis to huge repositories of genomic information for building new knowledge, taking advantage of the latest findings in applied genomics for healthcare. Powerful technology for data extraction and analysis is available, but broad use of the technology is hampered by the complexity of accessing such methods and tools. This work presents GeCoAgent, a big-data service for clinicians and biologists. GeCoAgent uses a dialogic interface, animated by a chatbot, for supporting the end-users’ interaction with computational tools accompanied by multi-modal support. While the dialogue progresses, the user is accompanied in extracting the relevant data from repositories and then performing data analysis, which often requires the use of statistical methods or machine learning. Results are returned using simple representations (spreadsheets and graphics), while at the end of a session the dialogue is summarized in textual format. The innovation presented in this article is concerned with not only the delivery of a new tool but also our novel approach to conversational technologies, potentially extensible to other healthcare domains or to general data science.

Differentially Private Medical Texts Generation Using Generative Neural Networks

Technological advancements in data science have offered us affordable storage and efficient algorithms to query a large volume of data. Our health records are a significant part of this data, which is pivotal for healthcare providers and can be utilized in our well-being. The clinical note in electronic health records is one such category that collects a patient's complete medical information during different timesteps of patient care available in the form of free-texts. Thus, these unstructured textual notes contain events from a patient's admission to discharge, which can prove to be significant for future medical decisions. However, since these texts also contain sensitive information about the patient and the attending medical professionals, such notes cannot be shared publicly. This privacy issue has thwarted timely discoveries on this plethora of untapped information. Therefore, in this work, we intend to generate synthetic medical texts from a private or sanitized (de-identified) clinical text corpus and analyze their utility rigorously in different metrics and levels. Experimental results promote the applicability of our generated data as it achieves more than 80% accuracy in different pragmatic classification problems and matches (or outperforms) the original text data.

Impact on the Stock Market Across the COVID-19 Outbreak

Abstract: This paper analyses the impact of the pandemic on global stock exchanges. Stock listing values are determined by a variety of factors, including seasonal changes, catastrophic calamities, pandemics, fiscal-year changes, and many more. This paper analyses the variation of listing prices over the worldwide outbreak of the novel coronavirus. The key reason to study this outbreak was to provide a notion of the underlying regulation of stock exchanges. Daily closing prices of the stock indices from January 2017 to January 2022 have been utilized for the analysis. The predominant feature of the research is to analyse whether the global economic downturn impacts the financial stock exchange. Keywords: Stock Exchange, Matplotlib, Streamlit, Data Science, Web scraping.

Information Resilience: the nexus of responsible and agile approaches to information use

Abstract: The appetite for effective use of information assets has been steadily rising in both public and private sector organisations. However, whether the information is used for social good or commercial gain, there is a growing recognition of the complex socio-technical challenges associated with balancing the diverse demands of regulatory compliance and data privacy, social expectations and ethical use, business process agility and value creation, and scarcity of data science talent. In this vision paper, we present a series of case studies that highlight these interconnected challenges, across a range of application areas. We use the insights from the case studies to introduce Information Resilience, as a scaffold within which the competing requirements of responsible and agile approaches to information use can be positioned. The aim of this paper is to develop and present a manifesto for Information Resilience that can serve as a reference for future research and development in relevant areas of responsible data management.

qEEG Analysis in the Diagnosis of Alzheimer's Disease: A Comparison of Functional Connectivity and Spectral Analysis

Alzheimer's disease (AD) is a brain disorder that is mainly characterized by a progressive degeneration of neurons in the brain, causing a decline in cognitive abilities and difficulties in engaging in day-to-day activities. This study compares an FFT-based spectral analysis against a functional connectivity analysis based on phase synchronization, for finding known differences between AD patients and Healthy Control (HC) subjects. Both of these quantitative analysis methods were applied on a dataset comprising bipolar EEG montage values from 20 diagnosed AD patients and 20 age-matched HC subjects. Additionally, an attempt was made to localize the identified AD-induced brain activity effects in AD patients. The obtained results showed the advantage of the functional connectivity analysis method compared to a simple spectral analysis. Specifically, while spectral analysis could not find any significant differences between the AD and HC groups, the functional connectivity analysis showed statistically higher synchronization levels in the AD group in the lower frequency bands (delta and theta), suggesting that AD patients' brains are in a phase-locked state. Further comparison of functional connectivity between the homotopic regions confirmed that the traits of AD were localized in the centro-parietal and centro-temporal areas in the theta frequency band (4-8 Hz). The contribution of this study is that it applies a neural metric for Alzheimer's detection from a data science perspective rather than from a neuroscience one. The study shows that the combination of bipolar derivations with phase synchronization yields similar results to comparable studies employing alternative analysis methods.
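As a rough illustration of the phase-synchronization metric used in studies like the one above, here is a minimal, standard-library-only sketch of the phase-locking value (PLV) between two instantaneous-phase sequences. The signals, frequencies, and thresholds are hypothetical, invented for the sketch rather than taken from the study:

```python
import cmath
import math
import random

def phase_locking_value(phases_a, phases_b):
    """Phase-Locking Value between two instantaneous-phase sequences.

    PLV = | mean over t of exp(i * (phi_a(t) - phi_b(t))) |
    A value of 1.0 means perfectly phase-locked; values near 0 mean
    the relative phase is essentially random (no coupling).
    """
    assert len(phases_a) == len(phases_b)
    s = sum(cmath.exp(1j * (a - b)) for a, b in zip(phases_a, phases_b))
    return abs(s) / len(phases_a)

# Toy check: two channels with a constant phase lag are fully locked,
# while an unrelated random-phase channel yields a low PLV.
t = [k * 0.01 for k in range(1000)]
locked_a = [2 * math.pi * 6 * x for x in t]        # 6 Hz phase ramp
locked_b = [2 * math.pi * 6 * x + 0.7 for x in t]  # same ramp, fixed lag

random.seed(0)
noisy_b = [random.uniform(0, 2 * math.pi) for _ in t]

print(round(phase_locking_value(locked_a, locked_b), 3))  # 1.0
print(phase_locking_value(locked_a, noisy_b))             # much smaller
```

In an actual qEEG pipeline the phases would come from a Hilbert transform or wavelet decomposition of band-filtered EEG channels; here they are supplied directly to keep the sketch self-contained.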

Big Data Analytics for Long-Term Meteorological Observations at Hanford Site

A growing number of physical objects with embedded sensors with typically high volume and frequently updated data sets has accentuated the need to develop methodologies to extract useful information from big data for supporting decision making. This study applies a suite of data analytics and core principles of data science to characterize near real-time meteorological data with a focus on extreme weather events. To highlight the applicability of this work and make it more accessible from a risk management perspective, a foundation for a software platform with an intuitive Graphical User Interface (GUI) was developed to access and analyze data from a decommissioned nuclear production complex operated by the U.S. Department of Energy (DOE, Richland, USA). Exploratory data analysis (EDA), involving classical non-parametric statistics, and machine learning (ML) techniques, were used to develop statistical summaries and learn characteristic features of key weather patterns and signatures. The new approach and GUI provide key insights into using big data and ML to assist site operation related to safety management strategies for extreme weather events. Specifically, this work offers a practical guide to analyzing long-term meteorological data and highlights the integration of ML and classical statistics to applied risk and decision science.
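A minimal sketch of the kind of exploratory summary statistics described above, using only Python's standard library. The wind-speed series and the 95th-percentile "extreme" threshold are illustrative assumptions, not the Hanford Site data:

```python
import random
import statistics

def summarize(values, extreme_quantile=0.95):
    """Basic EDA summary for a meteorological series: central tendency,
    spread, and a count of 'extreme' readings above a high quantile."""
    ordered = sorted(values)
    cut = ordered[int(extreme_quantile * (len(ordered) - 1))]
    return {
        "mean": statistics.fmean(values),
        "median": statistics.median(values),
        "stdev": statistics.stdev(values),
        "extreme_threshold": cut,
        "extreme_count": sum(v > cut for v in values),
    }

random.seed(1)
# Hypothetical hourly wind-speed readings (m/s) with a few injected gusts.
wind = [abs(random.gauss(5.0, 2.0)) for _ in range(500)] + [22.0, 25.0, 30.0]
report = summarize(wind)
print(report["extreme_count"])  # number of readings above the 95th percentile
```

A real workflow would layer non-parametric tests and ML models on top of summaries like this; the point here is only the shape of the exploratory step.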



Data science is an inter-disciplinary field that uses scientific methods, processes, algorithms and systems to extract knowledge and insights from structured and unstructured data. Data science is related to data mining and big data.





KAN: Kolmogorov-Arnold Networks

Abstract: Inspired by the Kolmogorov-Arnold representation theorem, we propose Kolmogorov-Arnold Networks (KANs) as promising alternatives to Multi-Layer Perceptrons (MLPs). While MLPs have fixed activation functions on nodes ("neurons"), KANs have learnable activation functions on edges ("weights"). KANs have no linear weights at all -- every weight parameter is replaced by a univariate function parametrized as a spline. We show that this seemingly simple change makes KANs outperform MLPs in terms of accuracy and interpretability. For accuracy, much smaller KANs can achieve comparable or better accuracy than much larger MLPs in data fitting and PDE solving. Theoretically and empirically, KANs possess faster neural scaling laws than MLPs. For interpretability, KANs can be intuitively visualized and can easily interact with human users. Through two examples in mathematics and physics, KANs are shown to be useful collaborators helping scientists (re)discover mathematical and physical laws. In summary, KANs are promising alternatives for MLPs, opening opportunities for further improving today's deep learning models which rely heavily on MLPs.
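The core KAN idea, learnable univariate functions on edges summed at nodes, can be sketched in plain Python. This toy uses piecewise-linear interpolation in place of the paper's B-spline parameterization, so it illustrates the structure only; it is not the authors' implementation, and no training loop is shown:

```python
class EdgeSpline:
    """A univariate edge activation parameterized by values at fixed knots,
    evaluated by linear interpolation: a simplified stand-in for the B-spline
    parameterization used in KANs. The knot values are the learnable weights."""

    def __init__(self, knots, values):
        assert len(knots) == len(values) and knots == sorted(knots)
        self.knots, self.values = knots, values

    def __call__(self, x):
        ks, vs = self.knots, self.values
        if x <= ks[0]:
            return vs[0]
        if x >= ks[-1]:
            return vs[-1]
        for i in range(len(ks) - 1):
            if ks[i] <= x <= ks[i + 1]:
                t = (x - ks[i]) / (ks[i + 1] - ks[i])
                return (1 - t) * vs[i] + t * vs[i + 1]

def kan_node(x_vec, edge_splines):
    """A KAN 'neuron': a sum of learnable univariate functions of each input,
    instead of a fixed nonlinearity applied to a weighted sum (as in MLPs)."""
    return sum(f(x) for f, x in zip(edge_splines, x_vec))

# No fitting here: just the evaluation path, with hand-set knot values
# approximating f(x) = x^2 on [-1, 1].
square = EdgeSpline([-1.0, -0.5, 0.0, 0.5, 1.0], [1.0, 0.25, 0.0, 0.25, 1.0])
print(square(0.5))   # 0.25 (exact at a knot)
print(square(0.25))  # 0.125 (linear interpolation between knots)
```

Because every "weight" is a readable 1-D function, a layer like this can be plotted knot-by-knot, which is the interpretability property the abstract emphasizes.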


DHS Informatics

IEEE 2023-2024: Data Science Projects



DHS Informatics provides the latest 2021-2022 IEEE projects on data science for final-year engineering students. DHS Informatics trains all students to develop their projects with a good idea of what they need to submit in college to get good marks. DHS Informatics also offers placement training in Bangalore under the program OJT (On Job Training); job seekers as well as final-year college students can join this placement training program and pursue job opportunities in their dream IT companies. We have been providing IEEE projects for B.E/B.TECH, M.TECH, MCA, BCA, and DIPLOMA students for more than two decades.

Python Final-Year CSE Projects in Bangalore

  • Python 2021 – 2022 IEEE PYTHON PROJECTS CSE | ECE | ISE


A Data Mining Based Model for Detection of Fraudulent Behaviour in Water Consumption

Abstract:  Fraudulent behavior in drinking water consumption is a significant problem facing water supplying companies and agencies. This behavior results in a massive loss of income and forms the highest percentage of non-technical loss. Finding efficient measurements for detecting fraudulent activities has been an active research area in recent years. Intelligent data mining techniques can help water supplying companies to detect these fraudulent activities to reduce such losses. This research explores the use of two classification techniques (SVM and KNN) to detect suspicious fraud water customers. The main motivation of this research is to assist Yarmouk Water Company (YWC) in Irbid city of Jordan to overcome its profit loss. The SVM based approach uses customer load profile attributes to expose abnormal behavior that is known to be correlated with non-technical loss activities. The data has been collected from the historical data of the company billing system. The accuracy of the generated model hit a rate of over 74% which is better than the current manual prediction procedures taken by the YWC. To deploy the model, a decision tool has been built using the generated model. The system will help the company to predict suspicious water customers to be inspected on site.                                                                                                                                                                                                                                   
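As an illustration of the KNN side of the approach described above, here is a minimal pure-Python nearest-neighbour classifier over monthly-consumption profiles. The feature values and labels are invented for the sketch and are not YWC billing data:

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """k-nearest-neighbour majority vote over (feature_vector, label) pairs."""
    by_dist = sorted(train, key=lambda row: math.dist(row[0], query))
    votes = Counter(label for _, label in by_dist[:k])
    return votes.most_common(1)[0][0]

# Hypothetical 4-month consumption profiles (m^3): the 'fraud' meters here
# show implausibly flat, low readings relative to normal households.
train = [
    ([30, 32, 31, 29], "normal"),
    ([28, 35, 33, 30], "normal"),
    ([31, 30, 34, 32], "normal"),
    ([4, 5, 4, 4], "fraud"),
    ([6, 5, 6, 5], "fraud"),
    ([5, 4, 5, 6], "fraud"),
]
print(knn_predict(train, [5, 5, 4, 5]))      # fraud
print(knn_predict(train, [30, 31, 33, 31]))  # normal
```

The paper's actual decision tool also evaluates an SVM and uses richer load-profile attributes; this sketch shows only the classification mechanics.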

Correlated Matrix Factorization for Recommendation with Implicit Feedback

Abstract: As a typical latent factor model, Matrix Factorization (MF) has demonstrated its great effectiveness in recommender systems. Users and items are represented in a shared low-dimensional space so that the user preference can be modeled by linearly combining the item factor vector V using the user-specific coefficients U. From a generative model perspective, U and V are drawn from two independent Gaussian distributions, which is not so faithful to the reality. Items are produced to maximally meet users' requirements, which makes U and V strongly correlated. Meanwhile, the linear combination between U and V forces a bisection (one-to-one mapping), which thereby neglects the mutual correlation between the latent factors. In this paper, we address the above drawbacks and propose a new model, named Correlated Matrix Factorization (CMF). Technically, we apply Canonical Correlation Analysis (CCA) to map U and V into a new semantic space. Besides achieving the optimal fitting on the rating matrix, one component in each vector (U or V) is also tightly correlated with every single component in the other. We derive efficient inference and learning algorithms based on variational EM methods. The effectiveness of our proposed model is comprehensively verified on four public data sets. Experimental results show that our approach achieves competitive performance on both prediction accuracy and efficiency compared with the current state of the art.

Heterogeneous Information Network Embedding for Recommendation

Abstract:  Due to the flexibility in modelling data heterogeneity, heterogeneous information network (HIN) has been adopted to characterize complex and heterogeneous auxiliary data in recommended systems, called HIN based recommendation. It is challenging to develop effective methods for HIN based recommendation in both extraction and exploitation of the information from HINs. Most of HIN based recommendation methods rely on path based similarity, which cannot fully mine latent structure features of users and items. In this paper, we propose a novel heterogeneous network embedding based approach for HIN based recommendation, called HERec. To embed HINs, we design a meta-path based random walk strategy to generate meaningful node sequences for network embedding. The learned node embeddings are first transformed by a set of fusion functions, and subsequently integrated into an extended matrix factorization (MF) model. The extended MF model together with fusion functions are jointly optimized for the rating prediction task. Extensive experiments on three real-world datasets demonstrate the effectiveness of the HERec model. Moreover, we show the capability of the HERec model for the cold-start problem, and reveal that the transformed embedding information from HINs can improve the recommendation performance.                                                         
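The meta-path based random walk at the heart of HERec can be sketched in a few lines of plain Python. The toy user-item graph and the User-Item meta-path below are hypothetical, chosen only to show how node types constrain the walk:

```python
import random

def meta_path_walk(graph, node_types, start, meta_path, length):
    """Generate a node sequence by walking a heterogeneous graph while
    forcing node types to follow the repeating meta-path pattern."""
    walk = [start]
    while len(walk) < length:
        wanted = meta_path[len(walk) % len(meta_path)]
        candidates = [n for n in graph[walk[-1]] if node_types[n] == wanted]
        if not candidates:
            break  # dead end: no neighbour of the required type
        walk.append(random.choice(candidates))
    return walk

# Tiny toy HIN: users connected to the items they rated.
graph = {
    "u1": ["i1", "i2"], "u2": ["i1"], "u3": ["i2"],
    "i1": ["u1", "u2"], "i2": ["u1", "u3"],
}
node_types = {"u1": "U", "u2": "U", "u3": "U", "i1": "I", "i2": "I"}

random.seed(42)
walk = meta_path_walk(graph, node_types, "u1", meta_path=["U", "I"], length=7)
print(walk)  # node types alternate U, I, U, I, ...
```

In HERec, sequences like this feed a skip-gram style embedding model, and the learned embeddings are then fused into an extended MF model; only the walk generation is shown here.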

NetSpam: A Network-Based Spam Detection Framework for Reviews in Online Social Media

Abstract: Nowadays, a large proportion of people rely on available content in social media in their decisions (e.g., reviews and feedback on a topic or product). The possibility that anybody can leave a review provides a golden opportunity for spammers to write spam reviews about products and services for different interests. Identifying these spammers and the spam content is a hot topic of research, and although a considerable number of studies have been done recently toward this end, the methodologies put forth so far still barely detect spam reviews, and none of them shows the importance of each extracted feature type. In this paper, we propose a novel framework, named NetSpam, which utilizes spam features for modeling review data sets as heterogeneous information networks to map the spam detection procedure into a classification problem in such networks. Using the importance of spam features helps us to obtain better results in terms of different metrics on real-world review data sets from the Yelp and Amazon Web sites. The results show that NetSpam outperforms the existing methods and, among four categories of features, including review-behavioral, user-behavioral, review-linguistic, and user-linguistic, the first type of features performs better than the other categories.

Comparative Study to Identify Heart Disease Using Machine Learning Algorithms

Abstract: Nowadays, heart disease is a common disease in the human body and has claimed many lives around the world. Every year, many people are affected by it, especially in the USA, followed by India. Doctors and clinical research indicate that heart disease does not happen suddenly; it is the result of a continuing irregular lifestyle and bodily activity over a long period, after which it appears suddenly with symptoms. Once those symptoms appear, people seek treatment in a hospital through different tests and therapies, which are somewhat expensive. To build awareness before the disease appears, people can get an idea about a patient's condition from this research result. This research collected data from different sources and split the data into two parts: 80% for the training dataset and the remaining 20% for the test dataset. Different classifier algorithms were used to try to get better accuracy, and that accuracy was then summarized. These algorithms are Random Forest Classifier, Decision Tree Classifier, Support Vector Machine, k-Nearest Neighbour, Logistic Regression, and Naive Bayes. SVM, Logistic Regression, and KNN gave the same, better accuracy than the other algorithms. This paper also examines which factors make a person vulnerable to heart disease, given basic attributes such as sex, glucose, blood pressure, heart rate, etc. The future direction of this paper is to use different devices and clinical trials for real-life experiments.
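The 80/20 split and accuracy comparison described above can be sketched with the standard library alone. The synthetic "records", the feature ranges, and the 1-nearest-neighbour stand-in classifier are all assumptions made for illustration; they are not the paper's data or models:

```python
import random

def train_test_split(rows, test_ratio=0.2, seed=7):
    """Shuffle labelled rows and split them 80/20, as described above."""
    rows = rows[:]
    random.Random(seed).shuffle(rows)
    cut = int(len(rows) * (1 - test_ratio))
    return rows[:cut], rows[cut:]

def accuracy(model, test):
    return sum(model(x) == y for x, y in test) / len(test)

# Hypothetical records: (glucose, resting heart rate) -> disease label.
rng = random.Random(11)
healthy = [((rng.uniform(70, 100), rng.uniform(55, 75)), 0) for _ in range(40)]
sick = [((rng.uniform(140, 200), rng.uniform(90, 120)), 1) for _ in range(40)]
train, test = train_test_split(healthy + sick)

def one_nearest(x):
    """1-NN classifier: predict the label of the closest training record."""
    gx, hx = x
    return min(train, key=lambda r: (r[0][0] - gx) ** 2 + (r[0][1] - hx) ** 2)[1]

def majority(_x):
    """Baseline: always predict the most common training label."""
    labels = [y for _, y in train]
    return max(set(labels), key=labels.count)

print(accuracy(one_nearest, test))  # well-separated classes: near-perfect
print(accuracy(majority, test))     # chance-level baseline
```

Comparing several real classifiers (SVM, logistic regression, etc.) over the same split follows the same pattern: train on `train`, score with `accuracy` on `test`.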

A Machine Learning Approach for Opinion Mining of Online Customer Reviews

Abstract: This study was conducted to apply supervised machine learning methods to opinion mining of online customer reviews. First, the study automatically collected 39,976 traveler reviews of hotels in Vietnam from the Agoda.com website, then trained machine learning models to find which model is most compatible with the training dataset and applied that model to forecast opinions for the collected dataset. The results showed that the Logistic Regression (LR), Support Vector Machine (SVM), and Neural Network (NN) methods have the best performance for opinion mining in the Vietnamese language. This study is valuable as a reference for applications of opinion mining in the field of business.

Hybrid Machine Learning Classification Technique to Improve the Accuracy of Heart Disease Prediction

Abstract: The area of medical science has attracted great attention from researchers. Several causes of early human mortality have been identified by a good number of investigators. The related literature has confirmed that diseases are caused by different factors, and one such cause is heart-based sickness. Many researchers have proposed distinctive methods to preserve human life and help health care experts recognize, prevent, and manage heart disease. Some of these methodologies facilitate the expert's decision, but every successful scheme has its own restrictions. The proposed approach robustly analyzes the performance of the Hidden Markov Model (HMM), Artificial Neural Network (ANN), Support Vector Machine (SVM), and Decision Tree J48, along with two different feature-selection methods, Correlation-Based Feature Selection (CFS) and Gain Ratio. The Gain Ratio is paired with the Ranker method over different groups of statistics. After analyzing this procedure, the intended method builds a Naive Bayes process that utilizes the two most appropriate processes in a suitably layered design. Initially, the intention is to select the most appropriate method by analyzing the performance of the available schemes executed with different features over the statistics.

Novel Supervised Machine Learning Classification Technique to Improve the Accuracy of Multi-Valued Datasets in Agriculture

Abstract: In the modern era, agricultural plant disease has many causes rooted in unfavorable weather conditions. Factors that influence disease in agricultural plants include variety/hybrid genetics, the age of plants at the time of infection, environment (soil, climate), weather (temperature, wind, rain, hail, etc.), single versus mixed infections, and the genetics of pathogen populations. Due to these factors, diagnosis of plant diseases at an early stage can be a difficult task. Machine Learning (ML) classification techniques such as Naïve Bayes (NB) and Neural Network (NN) techniques were compared to develop a novel technique to improve the level of accuracy.

Machine Learning and Deep Learning Approaches for Brain Disease Diagnosis: Principles and Recent Advances

Abstract: The brain is the controlling center of our body. With the passage of time, newer and newer brain diseases are being discovered. Thus, because of the variability of brain diseases, existing diagnosis or detection systems are becoming challenging and are still an open problem for research. Detection of brain diseases at an early stage can make a huge difference in attempting to cure them. In recent years, the use of artificial intelligence (AI) has been surging through all spheres of science, and no doubt, it is revolutionizing the field of neurology. Application of AI in medical science has made brain disease prediction and detection more accurate and precise. In this study, we present a review of recent machine learning and deep learning approaches to detecting four brain diseases: Alzheimer's disease (AD), brain tumor, epilepsy, and Parkinson's disease. 147 recent articles on these four brain diseases are reviewed, considering diverse machine learning and deep learning approaches, modalities, datasets, etc. Twenty-two datasets are discussed, which are used most frequently in the reviewed articles as a primary source of brain disease data. Moreover, a brief overview of different feature extraction techniques used in diagnosing brain diseases is provided. Finally, key findings from the reviewed articles are summarized and a number of major issues related to machine learning/deep learning-based brain disease diagnostic approaches are discussed. Through this study, we aim to find the most accurate technique for detecting different brain diseases, which can be employed for future betterment.

Prediction of Chronic Kidney Disease - A Machine Learning Perspective

Abstract: Chronic kidney disease is one of the most critical illnesses today, and proper diagnosis is required as soon as possible. Machine learning has become a reliable aid to medical treatment: with the help of machine learning classifier algorithms, doctors can detect the disease in time. From this perspective, chronic kidney disease prediction is discussed in this article. The chronic kidney disease dataset was taken from the UCI repository. Seven classifier algorithms were applied: artificial neural network, C5.0, Chi-squared Automatic Interaction Detector, logistic regression, linear support vector machine (LSVM) with penalty L1 and with penalty L2, and random tree. Feature selection techniques were also applied to the dataset. For each classifier, results were computed based on (i) full features, (ii) correlation-based feature selection, (iii) wrapper-method feature selection, (iv) least absolute shrinkage and selection operator (LASSO) regression, (v) synthetic minority over-sampling technique (SMOTE) with LASSO-selected features, and (vi) SMOTE with full features. The results show that LSVM with penalty L2 gives the highest accuracy, 98.86%, with SMOTE on full features. Along with accuracy, the precision, recall, F-measure, area under the curve and Gini coefficient were computed, and the results of the various algorithms are compared in graphs. LASSO-selected features with SMOTE gave the best results after SMOTE with full features; there, the linear support vector machine again gave the highest accuracy, 98.46%.
Along with the machine learning models, one deep neural network was applied to the same dataset, and it achieved the highest accuracy of 99.6%.
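The reported best configuration, a linear SVM with an L2 penalty, can be sketched with scikit-learn as below. The imbalanced synthetic data only stands in for the UCI chronic kidney disease set, and the SMOTE resampling step (which needs the separate imbalanced-learn package) is omitted.

```python
# Sketch of the abstract's best-performing classifier family:
# LinearSVC with penalty="l2". Data is synthetic, not the UCI set.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC
from sklearn.metrics import accuracy_score, f1_score

X, y = make_classification(n_samples=500, n_features=24,
                           weights=[0.7, 0.3], random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

svm = LinearSVC(penalty="l2", C=1.0, max_iter=5000)
svm.fit(X_tr, y_tr)
pred = svm.predict(X_te)
print("accuracy:", round(accuracy_score(y_te, pred), 3))
print("F1:", round(f1_score(y_te, pred), 3))
```

The F1 score is worth reporting alongside accuracy because, as in the paper, the classes are imbalanced.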

Potato Disease Detection Using Machine Learning

Abstract: In Bangladesh, potato is one of the major crops, and its cultivation has been very popular for the last few decades. However, potato production is being hampered by diseases that increase farmers' costs and disrupt their livelihoods. An automated, rapid disease-detection process is needed to increase potato production and digitize the system. Our main goal is to diagnose potato disease from leaf images using advanced machine learning technology. This paper offers an image-processing and machine-learning based automated system in which potato leaf diseases are identified and classified. Image processing is the best solution for detecting and analyzing these diseases. In this analysis, image segmentation is performed on more than 2,034 images of diseased and healthy potato leaves, taken from the openly accessible PlantVillage database, and several pre-trained models are used for the recognition and classification of diseased and healthy leaves. Among them, the best model predicts with an accuracy of 99.23% when tested with 25% test data and 75% train data. Our results show that machine learning exceeds all existing approaches in potato disease detection.

A Comparative Evaluation of Traditional Machine Learning and Deep Learning Classification Techniques for Sentiment Analysis

Abstract: With technological advancement in the field of digital transformation, the use of the internet and social media has increased immensely. Many people use these platforms to share their views, opinions and experiences. Analyzing such information is significant for any organization, as it helps the organization understand the needs of its customers. Sentiment analysis is an intelligible way to interpret the emotions in textual information and to determine whether an emotion is positive or negative. This paper outlines the data cleaning and preparation process for sentiment analysis and presents experimental findings that demonstrate the comparative performance of various classification algorithms. In this context, we have analyzed machine learning techniques (Support Vector Machine and Multinomial Naive Bayes) and deep learning techniques (Bidirectional Encoder Representations from Transformers and Long Short-Term Memory) for sentiment analysis.
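A minimal sketch of the classical-ML side of this comparison is a TF-IDF plus Multinomial Naive Bayes pipeline; the BERT/LSTM side needs a deep-learning stack and is omitted. The four-sentence toy corpus below is invented for illustration, not taken from the paper.

```python
# Sketch: TF-IDF features + Multinomial Naive Bayes for sentiment,
# one of the classical techniques the paper compares. Toy corpus only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

train_texts = ["great product, loved it",
               "terrible service, very slow",
               "happy with the quality",
               "awful experience, would not return"]
train_labels = ["positive", "negative", "positive", "negative"]

clf = make_pipeline(TfidfVectorizer(), MultinomialNB())
clf.fit(train_texts, train_labels)
print(clf.predict(["loved the great quality"])[0])  # → positive
```

Swapping `MultinomialNB()` for `LinearSVC()` in the same pipeline gives the SVM baseline with no other changes.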

A Comprehensive Review on Email Spam Classification using Machine Learning Algorithms

Abstract: Email is the most widely used official communication method for business purposes, and its usage continues to increase despite other methods of communication. Automated management of email is important today as volume grows day by day; more than 55 percent of all email is identified as spam. These spam messages consume users' time and resources while generating no useful output. Spammers use increasingly sophisticated and creative methods to carry out their criminal activities, so it is vital to understand different spam email classification techniques and their mechanisms. This paper focuses on spam classification approaches that use machine learning algorithms. Furthermore, this study provides a comprehensive analysis and review of research on different machine learning techniques and the email features used in different machine learning approaches. It also provides future research directions and the challenges in the spam classification field that can be useful for future researchers.

Heart Disease Prediction using Hybrid machine Learning Model

Abstract: Heart disease causes a significant mortality rate around the world and has become a health threat for many people. Early prediction of heart disease may save many lives; detecting cardiovascular diseases such as heart attacks and coronary artery disease is a critical challenge for routine clinical data analysis. Machine learning (ML) can bring an effective solution for decision making and accurate prediction, and the medical industry is showing enormous development in its use of machine learning techniques. In the proposed work, a novel machine learning approach to predicting heart disease is presented. The study uses the Cleveland heart disease dataset, with data mining techniques such as regression and classification. Three machine learning algorithms are implemented: (1) Random Forest, (2) Decision Tree, and (3) a hybrid model combining the two. Experimental results show an accuracy of 88.7% for the heart disease prediction model with the hybrid approach. An interface takes the user's input parameters and predicts heart disease using the hybrid Decision Tree/Random Forest model.
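The abstract does not publish the exact rule for combining Random Forest and Decision Tree, so the sketch below shows one plausible reading: a soft-voting ensemble that averages the two models' class probabilities. The data is synthetic, not the Cleveland dataset.

```python
# Hypothetical "hybrid" of Random Forest and Decision Tree as a
# soft-voting ensemble; one possible interpretation, on synthetic data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=300, n_features=13, random_state=2)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=2)

hybrid = VotingClassifier(
    estimators=[("rf", RandomForestClassifier(random_state=2)),
                ("dt", DecisionTreeClassifier(random_state=2))],
    voting="soft")  # average the two models' class probabilities
hybrid.fit(X_tr, y_tr)
hybrid_acc = accuracy_score(y_te, hybrid.predict(X_te))
print("hybrid accuracy:", round(hybrid_acc, 3))
```

Soft voting is chosen here because both base estimators expose `predict_proba`; hard majority voting would be another defensible reading of "hybrid".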

Heart Failure Prediction by Feature Ranking Analysis in Machine Learning

Abstract: Heart disease is one of the major causes of mortality in the world today, and prediction of cardiovascular disease is a critical challenge in clinical data analysis. Advanced developments in machine learning (ML), artificial intelligence (AI) and data science have proven effective in assisting decision making and prediction from the large quantities of data produced by the healthcare industry. ML approaches have brought many improvements and broadened study in the medical field, recognizing patterns in the human body through various algorithms and correlation techniques. Coronary heart disease is one such reality, and various studies give insight into predicting heart disease with ML techniques. Initially ML was used to estimate the degree of heart failure, but it is also used to identify, via correlation techniques, the significant features that affect heart disease. Many features/factors lead to heart disease, such as age, blood pressure, serum creatinine and ejection fraction. In this paper we propose a method for finding important features by applying machine learning techniques: the design and development of heart disease prediction by feature-ranking machine learning. ML thus has a huge impact in saving lives and helping doctors, widening the scope of research into actionable insights, driving complex decisions, and creating innovative products for businesses to achieve key goals.
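Feature ranking of the kind described can be sketched with a tree ensemble's impurity-based importances. The feature names below are illustrative, echoing the factors the abstract lists (age, blood pressure, serum creatinine, ejection fraction); the data is synthetic, not the study's clinical records.

```python
# Sketch: ranking candidate risk factors by Random Forest
# feature importance. Names and data are illustrative only.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

names = ["age", "blood_pressure", "serum_creatinine",
         "ejection_fraction", "serum_sodium", "platelets"]
X, y = make_classification(n_samples=400, n_features=len(names),
                           n_informative=3, random_state=3)

rf = RandomForestClassifier(random_state=3).fit(X, y)
ranking = sorted(zip(names, rf.feature_importances_),
                 key=lambda pair: pair[1], reverse=True)
for name, score in ranking:  # highest-importance feature first
    print(f"{name}: {score:.3f}")
```

Impurity-based importances sum to 1 across features; permutation importance would be a more robust (if slower) alternative when features are correlated.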

Design of face detection and recognition system to monitor students during online examinations using Machine Learning algorithms

Abstract: Today’s pandemic has transformed the way students are educated: education is delivered remotely through online platforms. Beyond online course content and online teaching, it has also changed the way of assessment. In online education, monitoring students' attendance is very important, as their presence is part of a good assessment of teaching and learning. Educational institutions have adopted online examination portals for assessing students. These portals use face recognition techniques to monitor the activities of the students and identify malpractice: students are captured through a web camera and their gestures and postures are analyzed. Image processing algorithms are widely used in the literature for face recognition. Despite the progress made in face detection systems, issues such as variation in human facial appearance (varying lighting conditions, noise in face images, scale, pose, etc.) block progress toward human-level accuracy. The aim of this study is to increase the accuracy of existing face recognition systems by making use of SVM and Eigenface algorithms. In this project, an Eigenface-like approach is used to extract facial features as facial vectors, and the datasets are trained using a Support Vector Machine (SVM) to perform face classification and detection. This makes face recognition faster and suitable for online exam monitoring.
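An Eigenface-style pipeline can be sketched in miniature: PCA projects face images into a low-dimensional "facial vector" space and an SVM classifies the projections. Random arrays stand in here for the webcam frames the abstract describes; every name below is illustrative.

```python
# Sketch of the Eigenface (PCA) + SVM idea on toy "face" data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_students, imgs_each, pixels = 5, 8, 32 * 32
# each "student" gets a distinct mean face plus per-pixel noise
X = np.vstack([rng.normal(loc=i, scale=1.0, size=(imgs_each, pixels))
               for i in range(n_students)])
y = np.repeat(np.arange(n_students), imgs_each)

# PCA finds the eigenface basis; the SVM separates the projections
model = make_pipeline(PCA(n_components=10), SVC(kernel="linear"))
model.fit(X, y)
print("training accuracy:", model.score(X, y))
```

In a real system the rows of `X` would be flattened, normalized face crops from the proctoring camera, and accuracy would be measured on held-out frames rather than the training set.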


DHS Informatics believes in students' satisfaction: we first brief students on the technologies and types of Data Science projects and projects in other domains. After a complete explanation of the concepts behind the IEEE Data Science projects, students may choose more than one IEEE Data Science project for functionality details. Students can even pick one project topic from Data Science and another two from other domains such as data mining, image processing, information forensics, big data and blockchain. DHS Informatics is a pioneer institute in Bangalore / Bengaluru, and we support project work for other institutes all over India. We are the leading final-year project centre in Bangalore / Bengaluru, with offices in five main locations: Jayanagar, Yelahanka, Vijayanagar, RT Nagar & Indiranagar.

We allow ECE, CSE and ISE final-year students to use the lab and assist them in project development work; we even encourage students to develop their own ideas into final-year projects for their college submission.

DHS Informatics first trains students on project-related topics; students then move into practical sessions. We have a well-equipped lab set-up, experienced faculty who work on our client projects, and a friendly student coordinator to assist students with their college project work.

Students appreciate our latest IEEE projects and concepts for final-year Data Mining projects in the ECE, CSE and ISE departments.

The latest IEEE 2021-2022 projects on Data Mining feature real-time concepts implemented using Java, MATLAB and NS2 with innovative ideas. Final-year students of computer science, information science, and electronics and communication can contact our corporate office in Jayanagar, Bangalore for Data Science project details.


Data Science mines knowledge from data, involving methods at the intersection of machine learning, statistics, and database systems. It is a powerful new technology with great potential to help companies focus on the most important information in their data warehouses. We have best-in-class infrastructure, lab set-up, training facilities, and an experienced research and development team for both the educational and corporate sectors.

Data Science is the process of examining huge amounts of data from different aspects and summarizing it into useful information. Data Science is a logical rather than a physical subset. Our work usually involves mining and text-based classification in Data Science projects for students.

Using a variety of data-analysis tools to identify relationships in data is the core process of Data Science. We support data mining projects for IT and CSE students carrying out their academic research projects.


Relationship to Statistics

The popularity of the term “data science” has exploded in business environments and academia, as indicated by a jump in job openings. However, many critical academics and journalists see no distinction between data science and statistics. Writing in Forbes, Gil Press argues that data science is a buzzword without a clear definition that has simply replaced “business analytics” in contexts such as graduate degree programs. In the question-and-answer section of his keynote address at the Joint Statistical Meetings of the American Statistical Association, noted applied statistician Nate Silver said, “I think data-scientist is a sexed up term for a statistician… Statistics is a branch of science. Data scientist is slightly redundant in some way and people shouldn’t berate the term statistician.” Similarly, in the business sector, multiple researchers and analysts state that data scientists alone are far from sufficient to grant companies a real competitive advantage, and consider data scientists to be only one of four greater job families companies require to leverage big data effectively: data analysts, data scientists, big data developers and big data engineers.

On the other hand, responses to the criticism are just as numerous. In a 2014 Wall Street Journal article, Irving Wladawsky-Berger compares the data science enthusiasm with the dawn of computer science. He argues that data science, like any other interdisciplinary field, employs methodologies and practices from across academia and industry, but will then morph them into a new discipline. He brings to attention the sharp criticisms computer science, now a well-respected academic discipline, once had to face. Likewise, NYU Stern’s Vasant Dhar, like many other academic proponents of data science, argued more specifically in December 2013 that data science is different from the existing practice of data analysis across all disciplines, which focuses only on explaining data sets; data science seeks actionable and consistent patterns for predictive uses. This practical engineering goal takes data science beyond traditional analytics: now the data in disciplines and applied fields that lacked solid theories, like health science and social science, can be sought and utilized to generate powerful predictive models.

Java Final year CSE projects in Bangalore

  • Java Information Forensic / Block Chain B.E Projects
  • Java  Cloud Computing B.E Projects
  • Java  Big Data with Hadoop B.E Projects
  • Java  Networking & Network Security B.E Projects
  • Java  Data Mining / Web Mining / Cyber Security B.E Projects
  • Java Data Science / Machine Learning B.E Projects
  • Java Artificial Intelligence B.E Projects
  • Java  Wireless Sensor Network B.E Projects
  • Java  Distributed & Parallel Networking B.E Projects
  • Java Mobile Computing B.E Projects

Android Final year CSE projects in Bangalore

  • Android  GPS, GSM, Bluetooth & GPRS B.E Projects
  • Android  Embedded System Application Projects for B.E
  • Android  Database Applications Projects for B.E Students
  • Android  Cloud Computing Projects for Final Year B.E Students
  • Android  Surveillance Applications B.E Projects
  • Android  Medical Applications Projects for B.E

Embedded  Final year CSE projects in Bangalore

  • Embedded  Robotics Projects for M.tech Final Year Students
  • Embedded  IEEE Internet of Things Projects for B.E Students
  • Embedded  Raspberry Pi Projects for B.E Final Year Students
  • Embedded  Automotive Projects for Final Year B.E Students
  • Embedded  Biomedical Projects for B.E Final Year Students
  • Embedded  Biometric Projects for B.E Final Year Students
  • Embedded  Security Projects for B.E Final Year

MATLAB  Final year CSE projects in Bangalore

  • MATLAB  Image Processing Projects for B.E Students
  • MATLAB  Wireless Communication B.E Projects
  • MATLAB  Communication Systems B.E Projects
  • MATLAB  Power Electronics Projects for B.E Students
  • MATLAB  Signal Processing Projects for B.E
  • MATLAB  Geo Science & Remote Sensors B.E Projects
  • MATLAB  Biomedical Projects for B.E Students


Data Science Journal


  • Collection: Data Management Planning across Disciplines and Infrastructures

Practice Papers

The Research Data Management Organiser (RDMO) – A Strong Community Behind an Established Software for DMPs and Much More

  • Ivonne Anders
  • Daniela Adele Hausen
  • Christin Henzen
  • Gerald Jagusch
  • Giacomo Lanza
  • Olaf Michaelis
  • Karsten Peters-von Gehlen
  • Torsten Rathmann
  • Jürgen Rohrwild
  • Sabine Schönau
  • Kerstin Vanessa Wedlich-Zachodin
  • Jürgen Windeck

This practice paper provides an overview of the Research Data Management Organiser (RDMO) software for data management planning and the RDMO community. It covers the background and history of RDMO as a funded project, its current status as a consortium and an open source software and provides insights into the organisation of the vibrant RDMO community. Furthermore, we introduce RDMO from a software developer’s perspective and outline, in detail, the current work in the different sub-working groups committed to developing DMP templates and related guidance materials.

  • Data management planning
  • DMP templates
  • Open source software
  • Community building

The Background and History of RDMO

The Research Data Management Organiser (RDMO) is a web-based software that enables research-performing institutions as well as researchers themselves to plan and carry out their management of research data. RDMO can assemble all relevant planning information and data management tasks across the whole life cycle of the research data. RDMO is ready for application in smaller or larger projects.

One of the results of ‘WissGrid’, a collaborative project in the German D-Grid context, was a small collection of guidelines on how to deal with data, including a set of questions to help organise data publication and data management ( Fiedler et al. 2013 ). The publication collected and reflected the discussion driven by the California Digital Library (CDL) and Digital Curation Centre (DCC) on data management and data management plans in the context of Germany and its landscape of research institutions and organisations.

Among the takeaways from this work: writing up DMPs merely to meet the requirements of a funding agency does not suffice to guide research projects through the subsequent processes of producing and analysing their research data, and DMPs and connected information are better kept within the workgroup, project or institution than on a central website.

With this motivation, the DFG-funded RDMO project was set up in 2015 ( DFG 2021 ), aiming to develop a modern, easy-to-install and easy-to-use web application with a questionnaire based on the aforementioned WissGrid guidelines ( Fiedler et al. 2013 ), a storage engine and configurable output options. The web app was exposed early to interested adopters. By giving extensive support, the RDMO project not only improved its web application but also attracted new local groups that formed to organise their data management work across institutional borders and local and community barriers.

The RDMO project continued in 2018 to interact intensely with the growing group of institutions and groups that used the RDMO web app in many different ways: not only to produce DMPs but also as a tool to organise consulting and coaching in data management, to enforce standardisation of data management within an institution, to feed available information (e.g., from lab instruments) into a project’s RDMO instance or to adapt the collected information into several formats required by funding agencies or research institutions. A DMP with these additional functionalities can also be used to initiate processes and tasks in the whole data lifecycle and is called ‘machine-actionable DMP’ (maDMP). In RDMO, we implemented the recommendations of the RDA WG DMP Common Standards ( Miksa et al. 2020 ).

RDMO from a Software Developer’s Perspective

RDMO is an open source tool whose code can be freely extended and modified. It is implemented as a web application: a backend running on a server, written mainly in Python using the Django framework ( https://djangoproject.com/ ), and a frontend based on common web technologies that provides the user interface in the browser, enabling a collaborative platform. Python and Django were chosen because Python is a high-level programming language that is relatively easy to learn; its emphasis on code readability and usability has made it a well-established language in the science community, which increases the chance that some Python knowledge is available wherever RDMO is installed, maintained, used and further developed. From the start, RDMO’s code has been freely available under an Apache 2.0 licence on GitHub, which also serves as a focal point for community feedback (bug reports, feature requests) and for defining and tracking RDMO’s future development ( https://github.com/rdmorganiser/rdmo ).

The software’s first release dates back to 2016. Subsequently, the RDMO community has seen over 60 new versions. Regular releases provide continuity and have made RDMO grow quite mature over time. The exact number of software downloads is unknown, but the number of productive and test instances has steadily been increasing during the last few years and has now reached 56 (source: https://rdmorganiser.github.io/Community/ , status: 11/09/2023).

RDMO was designed to keep technical hurdles for administrators as low as possible. It can be installed fairly quickly and does not need much storage space or processing power, because it primarily deals with textual data saved in rather small databases. RDMO only requires Python to run, a web server like Apache or Nginx to serve static files, and a database like PostgreSQL or MySQL. Docker images are provided as well, to ease running RDMO for those familiar with that technology.

Information is stored locally within an RDMO instance and is structured according to RDMO’s data model, presented in Figure 1 . A person compiling a DMP for a project is requested to address a series of questions. The answers are stored as values of internal variables called attributes and can then be further used to generate documents (views) or to activate actions (tasks).
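The flow just described (questions map answers onto named attributes, which views then render into documents) can be illustrated with a much-simplified, hypothetical sketch. RDMO's real Django data model is far richer; every class and attribute key below is invented for illustration.

```python
# Hypothetical miniature of the RDMO idea: answers are stored as
# values of named attributes and later rendered by a "view" template.
from dataclasses import dataclass, field

@dataclass
class Question:
    text: str
    attribute: str  # key in a shared, RDMO-style attribute domain

@dataclass
class Project:
    title: str
    values: dict = field(default_factory=dict)

    def answer(self, question: Question, value: str) -> None:
        # store the answer under the question's attribute
        self.values[question.attribute] = value

    def render_view(self, template: str) -> str:
        # substitute stored attribute values into a document template
        out = template
        for attr, value in self.values.items():
            out = out.replace("{" + attr + "}", value)
        return out

q = Question("Who is responsible for the data?", "project/data_manager")
p = Project("Demo project")
p.answer(q, "Jane Doe")
print(p.render_view("Data manager: {project/data_manager}"))
# → Data manager: Jane Doe
```

Because every catalog refers to the same attribute keys, a different questionnaire can fill the same `values` dictionary, which is the property that lets RDMO users switch catalogs and exchange content between instances.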

The RDMO data model

The RDMO data model. RDMO employs a complex data model organised along different Django apps and modules (representing database tables), which is well documented ( https://rdmo.readthedocs.io/en/latest/management/data-model.html ).

The exchange of information among instances is made possible by using a common attribute list (the RDMO domain), which ensures compatibility between question catalogs and still allows use-case-tailored question catalogs, option sets and views. All this content can be exchanged over the GitHub repository for content.

The RDMO domain currently includes 291 hierarchically ordered attributes, which cover all RDM aspects identified so far and thus play the role of a ‘controlled vocabulary’ for DMPs. The fundamental RDMO catalog contains 125 questions covering all aspects of research data management. Besides that, several other catalogs ( https://www.forschungsdaten.org/index.php/RDMO ) have been tailored to specific disciplines (engineering, chemistry, etc.), institutions (UARuhr, HeFDI) or funding programmes (SNF, Volkswagen Foundation). These take care to reuse as many questions and attributes from the main catalog and domain as possible, ensuring that the very same attributes are referred to by the questions in different catalogs; this preserves interoperability between existing projects and allows users to switch catalogs when necessary. For example, attribute supplements for the DFG questionnaires were successfully contributed by FoDaKo, a research data management cooperation of the Universities of Wuppertal, Düsseldorf and Siegen ( https://fodako.nrw/datenmanagementplan , see Figure 2 ), as was the questionnaire of the University of Erlangen-Nuremberg for the Volkswagen Foundation, Germany’s largest private research sponsor. An implementation of the Horizon Europe Data Management Plan Template (for the homonymous European funding framework programme) has also been added recently, comprising a questionnaire, new attributes and options, and a view (see Figure 3 ). Soon, the sub-working group will address other funding programmes from Germany and abroad, such as the Austrian funding organisation Fonds zur Förderung der wissenschaftlichen Forschung (FWF) ( https://www.fwf.ac.at/ ).

Overview of the FoDaKo questionnaires for projects funded by DFG

Overview of the FoDaKo questionnaires for projects funded by DFG. All questionnaires fulfil the DFG checklist and have different subject-specific coverage, from the ‘minimum’/‘intersection’ catalog with 85 questions to the ‘maximum’/‘union’ catalog (an extension of the core RDMO catalog) with 139 questions. The subject-specific questionnaires include further recommendations from the DFG Review Board on that subject. ‘All questions’ is an extension of the core RDMO catalog. Below each title, the number of questions is given.

Preview of the ready Horizon Europe Data Management Plan in the RDMO interface

Preview of the finished Horizon Europe Data Management Plan in the RDMO interface. Compared to the funders’ DMP templates, the questions in the RDMO catalogs are more precise and ‘fine-grained’. Filling out a DMP is further eased by help texts and controlled answer choices (options). Finally, export templates, i.e., views, convert the data management plan into the deliverable form required by the funder, inserting references to thematically overlapping questions.

The RDMO Consortium

The RDMO consortium was founded in 2020 by the signing of a Memorandum of Understanding (MoU) ( https://rdmorganiser.github.io/docs/Memorandum-of-Understanding-RDMO.pdf ) between several supporting German institutions and individuals. The organisational structure, with its various groups, was approved at an RDMO user meeting; this structure supports future development and is detailed in the MoU. Besides the general meeting of all members of the consortium, i.e., the signatories of the MoU, there are three permanent groups. Members and other interested parties can participate in the general meeting, which convenes at least once a year, or more often as required. All institutions interested in the preservation and further development of RDMO are invited to sign the MoU.

Some of the members are active in various RDM working groups, such as RDA and DINI/nestor ( DINI/nestor-AG Forschungsdaten 2022 ), and thus ensure a user-oriented focus on the RDMO content through their external cooperation.

The RDMO Steering Group (StG)

The RDMO consortium is led by a steering group (StG). The representatives of the StG are elected by the members at the general meeting every three years, or as needed. The StG guides the direction of further development and coordinates the processes for developing the software and its content. It is composed of at least five persons.

The RDMO Development Group (DG)

The technical coordination and further development of RDMO are organised by a development group. In addition to a core of long-term committed developers who continuously drive development forward, the low-threshold participation of a larger number of developers is needed and already in place; these can, for example, contribute to development on a project-specific basis.

The RDMO Content Group (CG)

The work of the CG members focuses on maintaining existing and newly generated content, such as attributes or questions for catalog templates. They provide moderation and support for individual processes, as well as domain adjustments. The CG collects user feedback from RDM coordinators and researchers from research institutions in Germany and checks the general usability of RDMO against the background of user feedback.

The work of the CG is currently organised into four sub-working groups and can spawn ad-hoc sub-working groups for special purposes.

Sub-Working Group Guidance Concepts and Texts

The ‘Practical Guide to the International Alignment of Data Management’ published by Science Europe ( 2021 ) provides specific guidance for different stakeholders, such as researchers and reviewers of DMPs, on how to manage research data, describe data management and review a DMP. The guide therefore comprises an overview of core aspects that should be included in a DMP. However, such guidance documents often lack discipline-specific recommendations. The sub-working group first collected discipline-specific best practices in data management. Based on this collection and its findings, the most relevant DMP sections requiring recommendations were identified. To structure the corresponding DMP guidance, the group adopted the software design pattern concept, used in software engineering for the systematic description of problem-solution pairs ( Gamma et al. 2014 ). The pattern concept provides a template for storing information, e.g., problems, solutions, concrete examples and related patterns. A specific DMP guidance template was developed by extending the initial pattern template. Using such a pattern structure for DMP guidance ensures that recommendations can be easily compared and linked. Moreover, the pattern structure can help raise awareness of the potential consequences of not implementing proper data management by pointing to concrete solutions. As a proof of concept and first collection of guidance patterns, examples were selected from the group’s own RDM support experience with research projects of different disciplinary foci, and the template was iteratively improved ( Henzen et al. 2022 ). The DMP guidance pattern structure can be applied to other DMP guidance texts and extended accordingly.

In the future, the working group will further elaborate on how to streamline its DMP pattern concept with RDM community activities, such as the Stamp project (Standardised data management plan for education sciences; Netscher et al. 2022 ) or the activities of the RDA working group ‘Discipline-specific Guidance for Data Management Plans’ ( https://rd-alliance.org/groups/discipline-specific-guidance-data-management-plans-wg ). Moreover, it plans to implement the envisioned community-driven guidance pattern collection process, e.g., by guiding RDM support teams and researchers in collecting further patterns and by providing guidance on how to use the pattern collection. On a practical level, the group aims to provide a basic set of patterns for the RDMO community for use in upcoming and existing DMP templates. The group nevertheless envisions the patterns being applicable across disciplines and tools, not limited to use in RDMO.

Sub-Working Group Editorial Processes

The sub-working group called Editorial Processes is responsible for the development, curation and harmonisation of the content that is necessary for the local usage of an RDMO instance: attributes, catalogs, conditions, option sets and views.

External authors have the option to make their questionnaires available to the general public in the ‘shared’ area of the RDMO repository for content ( https://github.com/rdmorganiser/rdmo-catalog ). Editorial Processes also accompanies content development by external authors, ensures its harmonisation, and adds newly created attributes and questions whenever they are of general relevance. Besides that, this sub-working group has coordinated the localisation of the RDMO software and of the RDMO content into French, Spanish and Italian, yielding a total of five languages.

Sub-Working Group Website

The transition of RDMO towards a community-based project required the website ( https://rdmorganiser.github.io/ ) to reflect the change from a project to a community as well. This sub-working group is engaged in improving the online representation of RDMO, tailoring the information for the different audiences, including end users (researchers), RDM managers/coordinators and system administrators. The focus is on providing informational material that is relevant to the needs of each audience.

The website intends to be the first point of contact for RDMO users or interested parties and to bring together all the available information about RDMO.

Sub-Working Group DFG Checklists

This sub-working group is working on the implementation of the Deutsche Forschungsgemeinschaft (DFG) guidelines for research data management in RDMO. These guidelines must be considered when drafting project proposals and are available as a checklist ( http://www.dfg.de/research_data/checklist ). Since spring 2022, many German universities have developed guidelines, commented versions of the DFG checklist or specific RDMO questionnaires to support their local researchers. The sub-working group was established in October 2022 to harmonise and map these local solutions, creating one community questionnaire and export template.

Conclusion and Outlook

The overall goals of the work of the RDMO consortium are to simplify RDM and DMP planning further for users, improve their experience and build a sustainable open source community. With the user perspective in mind, the focus is, therefore, particularly on motivating researchers to use RDMO for their purposes. One way the consortium intends to achieve this is by expanding RDMO catalogs for various purposes, so that DMPs offer additional benefits such as project management functions and exchange between the different researchers in a project. Researchers can be motivated in this respect, not only by familiarising them with RDMO but also by involving them in developing questionnaires that can be tailored to their discipline and/or to the needs of their community.

The development of several RDM initiatives, including the German National Research Data Infrastructure (NFDI, https://www.nfdi.de/consortia/ ), gives great momentum to the discussion around DMPs and facilitates the harmonisation and establishment of common infrastructures. In the coming years, it is expected that the importance of research data and corresponding data management will continue to increase enormously. This will also give rise to further environments and tools that facilitate RDM. Due to its strong community, RDMO has the possibility to offer a significant contribution to innovative and demand-oriented research data management.


The authors express their gratitude to the entire RDMO community for all the work, the discussions and the results achieved.

Funding Information

The authors would like to thank the Federal Government of Germany and the Heads of Government of the Länder, as well as the Joint Science Conference (GWK) and the German Research Foundation (DFG), for funding through the projects NFDI4Ing (project number 442146713) and NFDI4Earth (project number 460036893).

Competing Interests

The authors have no competing interests to declare.

Fiedler, N, et al. 2013. Leitfaden zum Forschungsdaten-Management: Handreichungen aus dem WissGrid-Projekt . Verlag Werner Hülsbusch. https://publications.goettingen-research-online.de/handle/2/14366 .  

Gamma, E, et al. 2014. Design patterns: elements of reusable object-oriented software . Boston, MA: Addison-Wesley. https://openlibrary.telkomuniversity.ac.id/pustaka/37782/design-patterns-elements-of-reusable-object-oriented-software.html .  

Henzen, C, et al. 2022. A Community-driven collection and template for DMP guidance facilitating data management across disciplines and funding. Zenodo . DOI: https://doi.org/10.5281/zenodo.6966878  

Miksa, T, Walk, P and Neish, P 2020. RDA DMP common standard for machine-actionable data management plans. Zenodo . DOI: https://doi.org/10.15497/rda00039  

Netscher, S, et al. 2022. Stamp—Standardisierter Datenmanagementplan für die Bildungsforschung. Zenodo . DOI: https://doi.org/10.5281/zenodo.6782478  

Science Europe 2021. Practical guide to the international alignment of research data management—extended edition. Zenodo . DOI: https://doi.org/10.5281/zenodo.4915862  

https://dini.de/ag/dininestor-ag-forschungsdaten/ .  

https://gepris.dfg.de/gepris/projekt/270561467 .  

Call for Reviewers for 3rd IEEE CVMI 2024 Conference

Welcome to the 3rd IEEE CVMI 2024. The CVMI 2022 proceedings and program schedule are published online.

  • IAPR Best Paper Award (Student): Manali Roy (Indian Institute of Technology (ISM), Dhanbad, India) [Paper ID: 176]
  • IAPR Best Paper Award (Professional): Akhilesh Kumar (Defence Institute of Psychological Research (DIPR), DRDO, India) [Paper ID: 200]
  • CVMI Best Paper Award (Computer Vision): Anmol Gautam (National Institute of Technology, Meghalaya, India) [Paper ID: 61]
  • CVMI Best Paper Award (Machine Intelligence): Shajahan Aboobacker (National Institute of Technology, Karnataka, India) [Paper ID: 129]
  • CVMI Best PhD Thesis Award: Dr. Koyel Mandal (Tezpur University, India)


Organizing Knowledge Partner Research Labs


The CVMI 2022 conference proceedings will be published by Springer.


The CVMI 2022 conference is endorsed by the International Association for Pattern Recognition "IAPR".


The CVMI 2022 conference is Technically Sponsored by IEEE Signal Processing Society UP Chapter.


About IEEE CVMI 2024

The IEEE CVMI 2024 conference is financially and technically sponsored by IEEE Uttar Pradesh Section. The CVMI 2024 conference is "Endorsed by International Association for Pattern Recognition (IAPR)" and "Technically Sponsored by IEEE SPS UP Chapter".

The conference programme will include regular paper presentations, along with keynote talks by prominent expert speakers in the field. All submitted papers will be double-blind peer-reviewed. Paper acceptance will be based on originality, significance, technical soundness, and clarity of presentation. The IAPR Best Paper Award and the CVMI-2024 Best Paper Awards will be given to the outstanding papers. The Best PhD Dissertation Awards will also be given in the PhD Symposium during IEEE CVMI 2024.

  • Successfully presented papers will be submitted to IEEE Xplore for publication.
  • Sponsored by IEEE Uttar Pradesh Section.
  • Endorsed by IAPR.
  • Technically Sponsored by IEEE Signal Processing Society Uttar Pradesh Chapter.
  • Indexed by Scopus and DBLP.
  • IAPR Best Paper Award and CVMI-2024 Best Paper Awards.
  • Best PhD Dissertation Awards.

Prayagraj Attractions:


IIIT Allahabad, Jhalwa, Prayagraj, Uttar Pradesh, India


Cabs are available from both the Railway Station and the Airport, subject to availability.


Accommodation is available in the Visitors Hostels of the Institute, subject to availability.


Food will be available throughout the event.


Computer Science Student and Professor at University of Puget Sound Win Best Paper at Big Data Conference

Julia Kaeppel ’24 and Prof. David Chiu published their research on database cache management.

University of Puget Sound student Julia Kaeppel ’24 has always been interested in computer programming. As a kid, she was a member of her elementary school robotics team and got hooked on programming in middle school as a pathway toward making video games. Kaeppel’s lifelong interest in operating systems and programming later led to an exciting research opportunity at Puget Sound. As a rising junior, Kaeppel approached Professor of Computer Science David Chiu about the possibility of working on a summer research project. He immediately had an idea for an impactful project they could tackle.

Julia Kaeppel stands next to Prof. David Chiu in front of stacked computers with wiring sticking out.

“I’ve been working on this database project for over a decade and I had an idea of where I wanted to go next with the research. It was just a matter of finding the right student because it required a unique skill set,” says Chiu. “That Julia is such a strong C programmer with the right skill set and an interest in operating systems and performance was pretty fortuitous.”

Databases often contain immense amounts of raw data. It takes a long time to search through all that data to find a given piece of information, so computers use caches to store previous results for reuse. Caches serve as shortcuts to get at relevant information quickly, but they have limited space. Chiu wanted to find the optimal sequence in which to dispatch the queries as well as the order in which to evict older cached results in an effort to improve query performance. That’s where Kaeppel’s research came in. With funding from a McCormick Summer Research grant, she was able to dive into the problem and spend 10 weeks trying to find a solution.

“Over the summer, we developed a couple of algorithms for reordering bitmap queries and we found that a lot of them didn’t work,” Kaeppel says. “However, we found that ordering queries by size from shortest to longest provided the greatest optimization. There’s an elegant simplicity to it.” 
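
As a hypothetical miniature (not the authors' actual system), the benefit of shortest-first dispatch can be illustrated with set-based queries, where a longer query can reuse any cached strict subset of its columns:

```python
# Toy illustration of shortest-first query dispatch: after a query runs,
# its result is cached, and a later query counts as a cache hit if any
# cached result covers a strict subset of its columns.
def run(queries):
    cache, hits = set(), 0
    for q in queries:
        if any(sub < q for sub in cache):  # strict-subset reuse
            hits += 1
        cache.add(q)
    return hits

qs = [frozenset({1, 2, 3}), frozenset({1, 2}), frozenset({1})]
print(run(qs))                       # longest-first order -> 0 hits
print(run(sorted(qs, key=len)))      # shortest-first order -> 2 hits
```

Dispatching the small queries first means their cached results are already available when the larger, overlapping queries arrive, which mirrors the "shortest to longest" intuition in the quote above.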

The result is deceptively simple, but could be proven mathematically to maximize the number of times queries could be reused over time. Chiu and Kaeppel described their research in a paper that was accepted for publication at the 10th Association for Computing Machinery (ACM) and Institute of Electrical and Electronics Engineers (IEEE) International Conference on Big Data Computing, Applications, and Technologies, where it won the award for best paper.

“This conference only accepts 25 to 30% of all papers submitted. So, to be accepted and then to win a best paper award is a major accomplishment,” Chiu says. “Julia isn’t a Ph.D. student. She’s an undergrad—and yet her work beat out every other paper at the conference. It’s unprecedented in my research group.” 

Kaeppel credits Professor Chiu’s mentorship for helping her develop the tools she needed to tackle the research project—and for opening her eyes to the possibility of doing more research after graduating from Puget Sound.

“It was a great experience and definitely broadened my horizons. Even if I don’t go into academic research, I could see myself pursuing a career working with algorithms and optimizations,” Kaeppel says.

“Julia was a joy to work with. She is dedicated and has an intuition on problem solving that makes her a very natural researcher,” Chiu adds. “When we got a result that didn’t look right, she knew where to dig for answers. That’s not something I would typically expect from an undergrad. Julia was already at that level—and that made all the difference in making this publication possible.”

© 2024 University of Puget Sound

Evaluating the merits and constraints of cryptography-steganography fusion: a systematic analysis

  • Regular Contribution
  • Open access
  • Published: 05 May 2024


  • Indy Haverkamp
  • Dipti K. Sarmah


In today's interconnected world, safeguarding digital data's confidentiality and security is crucial. Cryptography and steganography are two primary methods used for information security. While these methods have diverse applications, there is ongoing exploration into the potential benefits of merging them. This review focuses on journal articles from 2010 onwards and conference papers from 2018 onwards that integrate steganography and cryptography in practical applications. The results are gathered through different databases like Scopus, IEEE, and Web of Science. Our approach involves gaining insights into real-world applications explored in the existing literature and categorizing them based on domains and technological areas. Furthermore, we comprehensively analyze the advantages and limitations associated with these implementations, examining them from three evaluation perspectives: security, performance, and user experience. This categorization offers guidance for future research in unexplored areas, while the evaluation perspectives provide essential considerations for analyzing real-world implementations.



1 Introduction

Our daily lives are becoming increasingly linked with the digital realm, encompassing various activities such as messaging, cloud data storage, and financial transactions. Ensuring the security and confidentiality of this data is vital. Cryptography and steganography, two essential sciences of information security [ 74 , 77 ], offer solutions to render messages unintelligible to eavesdroppers and imperceptible to detection, respectively. These techniques play a crucial role in protecting sensitive information. Both fields serve the purpose of ensuring the confidentiality of data [ 69 ], albeit in different ways: cryptography shields the content of a message through the use of encryption keys, ensuring its protection, whereas steganography conceals the very presence of the message within a "cover" medium [ 74 ]. While cryptography finds extensive usage in various everyday applications, both techniques have their respective domains of application and can potentially be combined for enhanced security measures. Steganography encompasses a wide range of techniques and can be applied in different forms, such as images, audio, video, and text, to many applications, for example, IoT communication [ 7 , 21 , 39 ], military [ 71 ], cloud storage [ 2 , 18 , 46 , 67 ], and more [ 28 , 31 , 32 , 89 , 93 ]. The growth of interest in steganography was sparked in two ways: the multimedia industry could greatly benefit from possible watermarking techniques, and restrictions on cryptographic schemes by governments triggered interest in alternative ways for communication to stay secretive [ 8 ] (Fig. 1 ).

figure 1

Graph of published journal articles and conference papers on Scopus ( https://www.scopus.com/ —with query: ("cryptography" AND "steganography") AND ("application" OR "real-world") AND ("security" OR "cyberattack" OR "cybersecurity")) from 1996 to June 2023

Figure 1 visually represents the exponential growth of publications focusing on the applications of combining steganography and cryptography, as observed in Scopus. This trend highlights the increasing interest in merging or comparing these two disciplines within the research community. While the combination of multiple security mechanisms may appear advantageous, it is important to note that the suitability of combining cryptography with steganography can vary. Several factors, including bandwidth availability [ 37 , 81 ] and latency considerations [ 88 ], can influence the feasibility of such integration. For instance, incorporating additional layers of security may result in increased data size, potentially exceeding the available bandwidth and causing slower transmission speeds. Interestingly, the computational complexity of a combined approach does not always exhibit a linear increase. A notable example is presented in [ 25 ], where steganography with Diffie-Hellman encryption demonstrated the same time complexity as steganography alone, whereas RSA [ 91 ] encryption incurred a higher time complexity [ 25 ]. Therefore, the choice between these techniques heavily relies on the specific security requirements of the given situation and the particular types of cryptography and steganography employed. In this paper, 'a combined approach' refers to the combined use of steganography and cryptography; 'method' and 'scheme' interchangeably refer to a paper's combined implementation.
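
To make the notion of a combined approach concrete, the following sketch pairs a toy XOR stream cipher (a stand-in for a real cipher such as AES) with least-significant-bit (LSB) embedding in a list of grayscale pixel values. All names and details here are illustrative, not drawn from any surveyed paper:

```python
from itertools import cycle

def xor_encrypt(message: bytes, key: bytes) -> bytes:
    # Toy XOR stream cipher; a real system would use AES or similar
    return bytes(m ^ k for m, k in zip(message, cycle(key)))

def lsb_embed(pixels: list[int], payload: bytes) -> list[int]:
    # Write each payload bit into the least significant bit of one pixel
    bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
    assert len(bits) <= len(pixels), "cover image too small"
    out = pixels[:]
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit
    return out

def lsb_extract(pixels: list[int], n_bytes: int) -> bytes:
    # Read the LSBs back and reassemble them into bytes
    bits = [p & 1 for p in pixels[: n_bytes * 8]]
    return bytes(
        sum(bits[b * 8 + i] << i for i in range(8)) for b in range(n_bytes)
    )

key = b"k3y"
secret = b"hi"
cover = list(range(64))  # stand-in for grayscale pixel values
stego = lsb_embed(cover, xor_encrypt(secret, key))       # encrypt, then hide
recovered = xor_encrypt(lsb_extract(stego, len(secret)), key)
assert recovered == secret
```

The ordering (encrypt first, embed second) is the common pattern in the surveyed schemes: even if an attacker detects and extracts the hidden payload, it remains ciphertext.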

figure 2

Data gathering and study selection processes of both literature searches

As the number of systems requiring protection from cyberattacks continues to rise, the exploration of applications where steganography and cryptography can be combined becomes increasingly intriguing. Nonetheless, to identify potential areas for improvement or future research, it is imperative to gain a comprehensive understanding of the current state of research in this field.

The goal of this research is threefold, as described below; these goals also help formulate the research questions.

The research conducts a systematic literature review, aiming to bring forth a novel perspective by identifying and analyzing papers that examine the combined application of cryptography and steganography across real-world applications.

The research categorizes these applications based on diverse domains and contexts, such as their application domain (e.g., Medical or Transportation) and technological domain (e.g., Cloud Computing or Internet of Things).

The research also explores several relevant studies to identify the advantages, limitations, and trade-offs discussed in the existing literature and gain insight into how the performance of these combined implementations can be effectively analyzed.

The findings derived from this comprehensive review yield valuable insights into the current research landscape, contributing to advancements in fortifying systems against cyber threats. Consequently, these findings prompt the formulation of the following research questions, which further drive exploration and inquiry in this field. The primary research question focuses on exploring the advantages and limitations of utilizing a combined steganography and cryptography approach in diverse real-world applications as a means to enhance security against cyberattacks on a system.

To address this primary question, three key sub-questions necessitate analysis:

What are the various real-world applications where combined steganography and cryptography approaches can be used? (RQ1)

What are the advantages, limitations, and trade-offs of using a combined approach in these applications? (RQ2)

How are implementations of a combined approach evaluated across different real-world applications? (RQ3)

By addressing these sub-questions, a comprehensive understanding of the benefits, constraints, and evaluation methods surrounding the combined application of steganography and cryptography can be obtained, leading to significant insights for bolstering system security against cyber threats.

This paper is organized into several sections, beginning with the Introduction in Sect. 1 . Section 2 discusses the background and related work on steganography and cryptography techniques as well as evaluation methods. Section 3 elaborates on the methodology of the research, including the search strategy for conducting the literature review, the databases used to collect resources, and the tools used to optimize the efficiency of the review process. The results are presented in Sect. 4 , which covers the different types of applications and their categorization, explores the limitations and advantages of these applications, and analyzes the combined cryptography and steganography methods in terms of security, performance, and user perspective. Section 5 gives the concluding remarks and presents the future scope of the research. References are listed at the end of the paper.

2 Background and related work

Organizations, researchers, and end users show high interest in the sciences of steganography and cryptography for enhancing security across different applications and domains. In this research, we analyzed several papers that focus on the combination of cryptography and steganography to identify the research gap and the pros and cons of combining both sciences. For that, we focused on several relevant applications. One important and interesting application lies in the medical domain, where Bhardwaj [ 13 ] addresses the critical challenge of ensuring patient data privacy and security in telemedicine applications. The author proposes an enhanced reversible data-hiding technique operating in the encrypted domain. The proposed algorithm embeds secret messages in hexadecimal form and utilizes four binary bits of electronic patient information (EPI) in each cover image block, ensuring secure communication. Moreover, this approach mitigates underflow and overflow problems, enabling precise information embedding even in low-intensity pixel areas.

Similarly, the research in [ 22 ] discusses the growing challenge of securing medical data in healthcare applications due to the expanding presence of the Internet of Things (IoT). The authors propose a hybrid security model combining cryptography and steganography to protect diagnostic text data within medical images. The encryption process precedes the embedding of encrypted data into cover images, both color and grayscale, to accommodate varying text sizes. Performance evaluation based on six statistical parameters indicates promising results, with PSNR values ranging from 50.59 to 57.44 for color images and from 50.52 to 56.09 for grayscale images. The proposed model demonstrates its effectiveness in securely hiding confidential patient data within cover images while maintaining high imperceptibility and capacity, with minimal degradation in the received stego-image.
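
PSNR values such as those reported above are computed from the mean squared error (MSE) between the cover and stego images. A minimal sketch in pure Python, with flat pixel sequences standing in for images:

```python
import math

def mse(original, stego):
    # Mean squared error between two equal-sized pixel sequences
    return sum((a - b) ** 2 for a, b in zip(original, stego)) / len(original)

def psnr(original, stego, max_val=255):
    # Peak signal-to-noise ratio in dB; higher means less visible distortion
    e = mse(original, stego)
    return float("inf") if e == 0 else 10 * math.log10(max_val**2 / e)

cover = [10, 200, 30, 120]
stego = [11, 200, 31, 120]   # LSB changes flip pixel values by at most 1
print(round(psnr(cover, stego), 2))  # -> 51.14
```

Values above roughly 40 dB are generally considered imperceptible to the human eye, which is why stego schemes report PSNR in the 50s as evidence of high imperceptibility.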

Further, the research [ 34 ] states that as the elderly population increases and more people suffer from heart problems, hospitals worldwide are expected to use remote electrocardiogram (ECG) patient monitoring systems. These systems will gather large volumes of ECG signals from patients at home, along with other health measurements such as blood pressure and temperature, for analysis by remote monitoring systems. It is crucial to keep patient information safe when transmitting data over public networks and storing it on hospital servers. This study introduces a wavelet-based technique for hiding patient data within ECG signals, combining encryption and scrambling to protect the data. The technique embeds patient information into ECG signals without perceptibly altering their shape or diagnostic content. Tests show that the technique keeps the data safe (with less than 1% signal change) and does not corrupt the ECG readings, so clinicians can still interpret the ECGs after the hidden patient information is extracted, keeping medical data private and accurate.

Furthermore, the paper [ 41 ] proposes a novel steganography technique in their work, aiming to enhance the security of data transmission in telemedicine applications. This technique involves concealing patient information within medical images using a dynamically generated key, determined through graph coloring and the pixel count of the cover image. By combining steganography with cryptography, the patient information is encrypted using the RSA algorithm to strengthen security measures. Notably, this proposed method ensures reversibility, allowing for the lossless restoration of original medical images after data extraction from the stego medical image. Experimental evaluations demonstrate the efficacy of this approach, showcasing its superior security compared to alternative information hiding methods, particularly in terms of key generation complexity and the quality of restored images as measured by Peak Signal-to-Noise Ratio (PSNR) and Mean Square Error (MSE).

The researchers in [ 45 ] also worked along similar lines to enhance robust security measures in the handling of medical images, particularly when sensitive patient records are involved. To address this, a 128-bit secret key is generated based on the histogram of the medical image. Initially, the digital imaging and communications in medicine (DICOM) image undergoes a decomposition process to extract its sensitive features. The resulting image is then divided into blocks dependent on the generated key, followed by key-dependent diffusion and substitution processes. Encryption is performed over five rounds to ensure robust security. Subsequently, the secret key is embedded within the encrypted image using steganography, further enhancing the security of the proposed cipher. At the receiver's end, the secret key is extracted from the embedded image and decryption is carried out in reverse.

An innovative approach is presented in this paper [ 48 ], proposing an integrated method that combines cryptography and steganography to bolster data security in automotive applications. In this technique, data is first encrypted using a modified RSA cryptographic algorithm, and the encrypted data is then embedded along the edges of an image using the Least Significant Bit (LSB) technique. Edge detection [ 55 ] is accomplished using a fuzzy logic approach. This integrated approach is primarily designed for applications such as Diagnostics over Internet Protocol (DoIP) and Software Updates over the Air (SOTA), which involve the exchange of highly sensitive data. Additionally, the authenticity of the source of software updates is verified using a Hash Algorithm in SOTA.

Additionally, this paper [ 57 ] introduces a technique for encrypting and decrypting patient medical details, medical images, text, and pictorial forms using unique algorithms, aligning with the literature discussed above in the medical field. However, this research enhances security through the utilization of chaotic signals [ 59 ]. Signal generation and analysis are conducted using Matlab 7.10, demonstrating the efficacy of this method. Along similar lines, the paper by Parah et al. [ 61 ] introduces a novel and reversible data hiding scheme tailored for e-healthcare applications, emphasizing high capacity and security. The Pixel to Block (PTB) conversion technique is employed to generate cover images efficiently, ensuring the reversibility of medical images without the need for interpolation. To enable tamper detection and content authentication at the receiver, a fragile watermark and Block Checksum are embedded in the cover image, computed for each 4 × 4 block. Intermediate Significant Bit Substitution (ISBS) is utilized to embed Electronic Patient Records (EPR), watermark, and checksum data, preventing LSB removal/replacement attacks. Evaluation of the scheme includes perceptual imperceptibility and tamper detection capability under various image processing and geometric attacks. Experimental results demonstrate the scheme's reversibility, high-quality watermarked images, and effective tamper detection and localization.

In this research [ 75 ], the authors propose a new and secure steganography-based End-to-End (E2E) verifiable online voting system to address issues within the voting process. This research introduces a novel approach to online voting by integrating visual cryptography with image steganography to bolster system security while maintaining system usability and performance. The voting system incorporates a password-hashed-based scheme and threshold decryption scheme for added security measures. Further, the research [ 78 ] discusses the advantages of combining both steganography and cryptography for having more secure communication. Initially, the Advanced Encryption Standard (AES) algorithm is adapted and employed to encrypt the secret message. Subsequently, the encrypted message is concealed using a steganography method. This hybrid technique ensures dual-layered security, offering both high embedding capacity and quality stego images for enhanced data protection.

Furthermore, the authors in [ 87 ] introduce a novel Reversible data hiding in encrypted images (RDHEI) scheme leveraging the median edge detector (MED) and a hierarchical block variable length coding (HBVLC) technique. In this approach, the image owner predicts the pixel values of the carrier image with MED, followed by slicing the prediction error array into bit-planes and encoding them plane by plane. Experimental results demonstrate that the proposed scheme not only restores secret data and the carrier image without loss but also surpasses state-of-the-art methods in embedding rate across images with diverse features.
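
The median edge detector (MED) mentioned here is a standard predictor, known from JPEG-LS, that estimates a pixel from its left, top, and top-left neighbours. A minimal sketch of the usual formulation (the surveyed paper's exact variant may differ):

```python
def med_predict(a: int, b: int, c: int) -> int:
    """Median edge detector: predict a pixel from its left (a), top (b),
    and top-left (c) neighbours. Picks min/max across detected edges,
    otherwise follows the local gradient."""
    if c >= max(a, b):
        return min(a, b)
    if c <= min(a, b):
        return max(a, b)
    return a + b - c

# Horizontal edge: top-left and top are bright, left is dark
print(med_predict(40, 200, 210))   # -> 40
# Smooth region: prediction follows the local gradient
print(med_predict(100, 110, 105))  # -> 105
```

In RDHEI schemes of this kind, the prediction errors (actual pixel minus predicted pixel) are small in smooth regions, so their bit-planes compress well, and it is this freed-up space that carries the embedded secret data.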

The previously discussed papers primarily centered on application domains. In contrast, we also examined several papers that primarily focus on technological domains. This paper [ 1 ] presents the Circle Search Optimization with Deep Learning Enabled Secure UAV Classification (CSODL-SUAVC) model tailored for Industry 4.0 environments. The CSODL-SUAVC model aims to achieve two core objectives: secure communication via image steganography and image classification. The proposed methodology involves Multi-Level Discrete Wavelet Transformation (ML-DWT), CSO-related Optimal Pixel Selection (CSO-OPS), and signcryption-based encryption. The proposed CSODL-SUAVC model is experimentally validated using benchmark datasets, demonstrating superior performance compared to recent approaches across various evaluation aspects.

In their paper [ 5 ], the authors introduce an improved system designed to safeguard sensitive text data on personal computers by combining cryptography and steganography techniques. The system's security is fortified by employing RSA cryptography followed by audio-based steganography. The study includes system modeling and implementation for testing, aimed at exploring the relationship between security, capacity, and data dependency. Experimentation involved securing data within 15 differently sized audio files, yielding insightful results.

Additionally, the research in [ 7 ] discusses the promising growth of the Internet of Things (IoT) and the prevalent use of digital images, which has resulted in an increased adoption of image steganography and cryptography. However, current systems encounter challenges related to security, imperceptibility, and capacity. In response, they propose a new Crypt-steganography scheme that integrates three primary elements: hybrid additive cryptography (HAC) for secure message encryption, the bit interchange method (BIGM) to ensure imperceptibility during embedding, and a novel image partitioning method (IPM) for enhanced randomness in pixel selection. Evaluations confirm the scheme's effectiveness in addressing these challenges.

Also, the authors of [ 9 ] presented a novel approach to safeguard data on the cloud using Reversible Data Hiding in an Encrypted Image (RDHEI), coupled with homomorphic encryption and a rhombus pattern prediction scheme. With this method, third parties can perform data-hiding operations on encrypted images without knowledge of the original content, ensuring high-level security. The proposed method demonstrates strong protective measures, as evidenced by experimentations. Additionally, the approach enables seamless image recovery and covert extraction.

Further, in [ 10 ], the authors explore how malicious Android applications evade detection by hiding within images using techniques such as concatenation, obfuscation, cryptography, and steganography. They assess the vulnerability of ten popular Android anti-malware solutions to these methods. Surprisingly, only one solution detected two of the hiding techniques, while the others remained blind to all eight. This evaluation offers insights into the evolving landscape of Android malware and the effectiveness of current detection systems.

Insufficient security measures in data transmission lead to issues with data integrity, confidentiality, and loss, especially with big data. Executing multiple security algorithms reduces throughput and increases security overhead, impacting robustness against data loss. Conversely, compression techniques sacrifice data confidentiality. Existing studies lack comprehensive security policies to address these concerns collectively. Therefore, the authors in their paper [ 14 ] propose an integrated approach to enhance confidentiality and provide backup against accidental data loss by combining the simplified data encryption standard (SDES) with advanced pattern generation. A novel error control technique maximizes data integrity against transmission errors. A new compression method improves robustness against data loss while maintaining efficiency. Enhanced confidentiality and integrity are achieved through advanced audio steganography. Implementing this integrated technique in a GPU environment accelerates execution and reduces complexity. Experiments validate the method's effectiveness in ensuring data confidentiality and integrity, outperforming contemporary approaches.

By covering one more technological aspect, the research [ 19 ] introduces a secure Near Field Communication (NFC) smartphone access system using digital keys and Encrypted Steganography Graphical Passwords (ESGP). User perceptions and intentions are evaluated through experiments and surveys, emphasizing security as a key factor in adopting NFC ESGP systems. This offers valuable insights for enhancing security through two-factor authentication on NFC-enabled smartphones.

Further, recognizing fog computing as an intriguing domain bridging the cloud and the Internet of Things (IoT), a secure communication channel is necessary to prevent attacks. The paper [ 33 ] discusses the strengths and weaknesses of hybrid cryptography-and-steganography strategies in fog environments, where real-time transmission is crucial, and presents a novel Fog-Based Security Strategy (FBS2) that integrates both techniques. Its cryptography technique (PCT) entails two phases, confusion and diffusion, which scramble and modify the pixel values of secret images using innovative methodologies. Its steganography technique utilizes the discrete wavelet packet transform, employing a new matching procedure based on the most significant bits of the encrypted secret images and the cover image pixels. Experimental results illustrate FBS2's superiority in efficiency, security, and processing time, making it well-suited for fog environments.

Furthermore, the paper [ 36 ] explores Industry 5.0, which merges human and machine capabilities to meet complex manufacturing demands through optimized robotized processes. Industry 5.0 utilizes collaborative robots (cobots) for improved productivity and safety, while unmanned aerial vehicles (UAVs) are expected to have a significant role. Despite UAVs' advantages like mobility and energy efficiency, challenges such as security and reliability persist. To address this, the article presents AIUAV-SCC, an artificial intelligence-based framework tailored for Industry 5.0. It consists of two main phases: image steganography-based secure communication and deep learning (DL)-based classification. Initially, a new image steganography technique is employed, integrating multilevel discrete wavelet transformation, quantum bacterial colony optimization, and encryption processes.

Another interesting method, proposed in [ 85 ], encrypts digital images using a special type of mathematical system called a chaotic system. Chaotic systems have properties that make them very difficult to predict and control, which is useful for encryption. The method uses a specific chaotic system, the two-dimensional Hénon-Sine map (2D-HSM), designed to be more effective than other chaotic systems for this purpose. Additionally, the method incorporates a DNA-inspired technique to further enhance the encryption process. This new encryption scheme aims to protect images sent over the Internet. The paper presents experimental tests showing that the scheme outperforms other methods in terms of security and resistance to attacks.
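The exact 2D-HSM recurrence from [ 85 ] is not reproduced here; as a hedged illustration of the general idea behind chaos-based image encryption, the sketch below uses the classic one-dimensional logistic map instead (the parameter values are arbitrary example keys, not the paper's) to generate a keystream that is XORed with the pixel bytes:

```python
def logistic_keystream(x0: float, r: float, n: int):
    """Generate n pseudo-random bytes by iterating the logistic map
    x_{k+1} = r * x_k * (1 - x_k), quantizing each state to 8 bits."""
    ks, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        ks.append(int(x * 256) % 256)
    return ks

def xor_encrypt(pixels: bytes, x0=0.3141, r=3.9999) -> bytes:
    """Encrypt (or, applied twice, decrypt) a pixel buffer with the
    chaotic keystream; (x0, r) act as the secret key."""
    ks = logistic_keystream(x0, r, len(pixels))
    return bytes(p ^ k for p, k in zip(pixels, ks))
```

Because XOR with the same keystream is its own inverse, applying `xor_encrypt` twice with the same key recovers the original pixels; sensitivity to tiny changes in `x0` or `r` is what makes chaotic maps attractive for this purpose.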

Furthermore, advanced cloud computing is considered one of the prominent technologies offering cost-saving benefits and flexible services. With the increasing volume of multimedia data, many data owners choose to outsource their data to the cloud. However, this trend raises privacy concerns, as users relinquish control over their data. To address these concerns, reversible data hiding schemes for encrypted image data in cloud computing have been proposed by [ 86 ]. These schemes aim to ensure data security without relying on the trustworthiness of cloud servers. Theoretical analysis confirms the security and correctness of the proposed encryption model, with acceptable computation costs adjustable based on security needs.

We also focused on Conference papers that explore the combination of cryptography and steganography, covering various applications and technological domains. The work at [ 23 ], presents a novel framework that combines a hybrid encryption scheme using chaotic maps and 2D Discrete Wavelet Transform (DWT) Steganography to enhance security by maintaining patient privacy. Additionally, a web-based monitoring platform is deployed for tracking electronic medical records during transmission. Experimental results show that the proposed framework outperforms existing approaches in terms of accuracy, sensitivity, and perceptibility, with high imperceptibility and limited degradation in the stego image.

Along similar lines, the authors of [ 29 ] state that the aim of their study is to protect the privacy and confidentiality of data during multimedia exchanges between two IoT hops in uncertain environments. To achieve this, a robust multilevel security approach based on information hiding and cryptography is proposed to deter attackers and ensure data confidentiality. Existing schemes often struggle to strike a balance between medical image quality and security, and directly embedding secret data into images followed by encryption can make it easy for intruders to detect and extract hidden information. This study yields superior results in terms of imperceptibility and security by employing the right method in the right context.

Covering another application aspect, reversible data hiding (RDH) ensures secure digital data transmission, which is especially vital in telemedicine, where medical images and electronic patient records (EPR) are exchanged. This study [ 47 ] proposes a novel RDH scheme that embeds EPR data during image encryption. Using a block-wise encryption technique, the scheme hides EPR data bits within the encrypted image. A support vector machine (SVM)-based classification scheme is employed for data extraction and image recovery. Experimental results show superior performance compared to existing schemes in terms of embedding rate and bit error rate.

Further, network security is crucial in safeguarding against malicious attacks, especially with the rapid growth of e-commerce worldwide. This study [ 52 ] proposes a novel approach to enhance online shopping security by minimizing the sharing of sensitive customer data during fund transfers. Combining text-based steganography, visual cryptography, and OTP (One Time Password), the proposed payment system ensures customer data privacy, prevents identity theft, and increases customer confidence. By utilizing steganography and visual cryptography, the method minimizes information sharing between consumers and online merchants, thereby enhancing data security and preventing misuse of information.

Moving forward with another interesting research work [ 62 ] that focuses on e-commerce platform transactions: this study proposes a two-layered security mechanism for e-transactions using dynamic QR codes. The first layer encapsulates payment information within a dynamic QR code, unique to each order, which includes bank details, user information, and order specifics. The second layer employs encryption through Secure Electronic Transactions (SET) to further secure the payment process. This dual-layer approach enhances security by introducing dynamic QR codes, reducing vulnerability to cyber-attacks and ensuring secure transmission of payment data. On the other side, the authors of [ 3 ] proposed a lightweight dynamic encryption algorithm using DNA-based stream ciphers. The algorithm generates a one-time dynamic key (DLFSR) based on collected data, encoding both the text and the key into a dynamic DNA format. The ciphertext is then produced through an addition process using a proposed table, with decryption information hidden within it for key distribution. Statistical tests and performance evaluations demonstrate the algorithm's effectiveness in securing restricted devices, outperforming previous approaches.
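The DNA encoding step underlying such stream ciphers maps pairs of bits to nucleotide symbols. The coding table below (00→A, 01→C, 10→G, 11→T) is one common convention chosen for illustration; it is not necessarily the table proposed in [ 3 ]:

```python
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {v: k for k, v in BITS_TO_BASE.items()}

def dna_encode(data: bytes) -> str:
    """Encode bytes as a DNA-style string, two bits per nucleotide."""
    bits = "".join(f"{b:08b}" for b in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def dna_decode(seq: str) -> bytes:
    """Invert dna_encode: map nucleotides back to bit pairs, then bytes."""
    bits = "".join(BASE_TO_BITS[ch] for ch in seq)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
```

A full DNA-based cipher would additionally combine the encoded text with the encoded dynamic key (e.g. via an addition table over {A, C, G, T}); the encoding above is only the representation layer.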

To safeguard IPv6 packet identities against Denial-of-Service (DoS) attacks, this paper [ 6 ] proposes a combination of cryptography and steganography methods. Ensuring secure communication in IPv6 network applications is crucial due to prevalent issues like DoS attacks and IP spoofing. The proposed approach involves generating unique identities for each node, encrypting them, and embedding them into transmitted packets. Upon reception, packets are verified to authenticate the source before processing. The paper conducts nine experiments to evaluate the proposed scheme, which includes creating IPv6 addresses, applying logistic mapping, RSA encryption, and SHA-2 authentication. Network performance is assessed using the OPNET Modeler, demonstrating improved power consumption and better overall results in memory usage, packet loss, and traffic throughput. In a similar line, the paper [ 11 ] suggests a hybrid security method using hashing, encryption, and image steganography to better protect user credentials in databases. The aim is to help developers integrate strong password security practices into their software development process to prevent data breaches. Experimental results show the effectiveness of this approach.

Security is crucial across various applications, including cloud storage and messaging. While AES, DES, and RSA are common encryption methods, relying solely on one can lead to vulnerabilities if the encryption key is compromised. To address this, hybrid cryptography is employed in this research [ 12 ], combining existing techniques with three new methods. Data is divided into three sections and encrypted with AES, DES, and RSA respectively. Encryption keys are stored using LSB steganography in an image, ensuring additional security. Users must retrieve the keys from the image to access and decrypt the data stored in the cloud, enhancing overall security. Further, Castillo et al. [ 17 ] present a new mobile app that secures images using AES encryption and LSB steganography. It employs a 256-bit AES key for robust protection and utilizes the Diffie-Hellman algorithm for secure key exchange. The app development follows the Rapid Application Development Model, ensuring iterative refinement and early testing. Evaluation based on ISO/IEC/IEEE 29119 Testing Standards indicates user satisfaction with an overall mean rating of 4.17.
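The LSB embedding that both [ 12 ] and [ 17 ] rely on is straightforward to sketch. The illustration below operates on a raw buffer of 8-bit pixel values (one byte per channel) and prefixes the payload with a 16-bit length header; the header format is an assumption made for the example, not a detail from either paper:

```python
def lsb_embed(pixels: bytearray, payload: bytes) -> bytearray:
    """Hide payload in the least significant bit of successive pixel
    bytes, preceded by a 16-bit big-endian length header (assumed)."""
    msg = len(payload).to_bytes(2, "big") + payload
    bits = [(byte >> (7 - k)) & 1 for byte in msg for k in range(8)]
    if len(bits) > len(pixels):
        raise ValueError("cover too small for payload")
    out = bytearray(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit   # overwrite only the LSB
    return out

def lsb_extract(pixels: bytearray) -> bytes:
    """Read the length header, then recover that many payload bytes."""
    def read_bytes(start, count):
        vals = []
        for b in range(count):
            v = 0
            for k in range(8):
                v = (v << 1) | (pixels[start + b * 8 + k] & 1)
            vals.append(v)
        return bytes(vals)
    n = int.from_bytes(read_bytes(0, 2), "big")
    return read_bytes(16, n)
```

Since only the least significant bit of each pixel changes, every modified value differs from the cover by at most 1, which is what keeps the stego image visually indistinguishable from the original.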

As mentioned above, one of the interesting areas is Cloud Computing (CC) which has emerged as a popular model for delivering services over the Internet, with Software as a Service (SaaS) being a prominent example. Despite its benefits, security remains a concern. This paper [ 24 ] presents an application model for securing SaaS applications hosted on private clouds. The model consists of two micro-services: an Application Layer Firewall to prevent malicious activity, and a secure login application for sensitive data transmission. Additionally, a Hidden Markov Model layer is implemented for intrusion detection. The second micro-service uses Advanced Encryption Standard (AES) for document encryption within the private cloud. Further security is provided through a novel Video Steganography approach using the Least Significant Bit (LSB) technique. Overall, the paper outlines a comprehensive approach to enhance security in SaaS applications.

Further, since confidentiality and integrity are important aspects of sharing sensitive information during communication, the research [ 38 ] introduces “Stag-chain”, a blockchain-based design combining steganography, AES encryption, and the InterPlanetary File System (IPFS) protocol for decentralized cloud storage. The image file is stored on the cloud temporarily and replaced by a normal image afterward. This scheme aims to develop an app ensuring data confidentiality, secure data transmission, and protection against unauthorized access. Furthermore, Madavi et al. [ 43 ] introduce a compact steganography technique for robust data hiding with perfect invisibility. It combines DES, AES, and RC4 encryption methods for enhanced security. The study aims to achieve data security using steganography with the Least Significant Bit (LSB) algorithm and hybrid encryption, encrypting user input and concealing it within image files for maximum security during message transmission.

Additionally, the authors [ 51 ] introduce a highly secure web-based authentication system utilizing Image Steganography and the 128-bit Advanced Encryption Standard (AES) algorithm. This system encrypts user passwords using AES. Also, face identification photographs are used as stegoimages to conceal the encrypted passwords, further enhancing security. The proposed work demonstrated resilience against advanced steganalysis attacks, including the chi-squared attack and neighborhood histogram. The authors recommended this secure authentication method for future web applications dealing with sensitive user data.

In [ 65 ], the authors investigate using audio signal processing for cybersecurity in voice applications. As voice interfaces become ubiquitous in devices, the research focuses on securely identifying and authenticating users through cryptography and audio steganography, ensuring both security and usability. Also, the paper [ 66 ] introduces security strategies aimed at enhancing data protection in the cloud, addressing concerns such as confidentiality, accessibility, and integrity. By leveraging steganography, encryption-decryption techniques, compression, and file splitting, the proposed approach aims to overcome the limitations of traditional data protection methods, providing clients with an effective and secure means to store and share information.

Further, the transmission of satellite images via the Internet has gained considerable attention, especially with the rise of cloud and web-based satellite information services. Ensuring secure and high-quality data transfer to users has become a priority. To address this in the research [ 70 ], a combination of steganography and cryptography techniques is employed. Steganography hides data within images, audio, or video, while cryptography ensures data remains unintelligible to cyber attackers. This fusion approach offers a unique method for information protection. The paper proposes combining steganography algorithms such as Least Significant Bit (LSB) and Most Significant Bit (MSB) with cryptography algorithms like Rivest-Shamir-Adleman (RSA) for enhanced security.

This is another interesting research under technology development [ 73 ]. The rise of multimedia applications has fueled the use of digital archives, with cloud storage being a common choice for storing, transmitting, and sharing multimedia content. However, the reliance on cloud services poses security risks, compromising data privacy. To mitigate these risks, data access is restricted to authenticated users, and data is encrypted before storage in the cloud. Cipher Text-Policy Attribute-Based Encryption (CP-ABE) is used to encrypt data and control access, but traditional CP-ABE requires substantial computing resources. To address this, an efficient pairing-free CP-ABE scheme using elliptic curve cryptography is proposed, reducing memory and resource requirements. However, even with CP-ABE, cryptanalysis may still enable plaintext recovery. To enhance data security and ownership, cryptography is combined with steganography, embedding the ciphertext into images to thwart cryptanalysis and improve data security and privacy, particularly for multimedia applications.

Further, Modern healthcare relies on secure medical imaging systems for accurate diagnosis. This paper [ 79 ] proposes a method to protect the JPEG compression processor used in these systems from threats like counterfeiting and Trojan insertion. By integrating robust structural obfuscation and hardware steganography, the approach ensures double-layered defense with minimal design cost. Also, Online shopping presents risks such as credit card fraud and identity theft. This paper [ 80 ] introduces a novel scheme to detect and prevent phishing sites using extended visual cryptography, steganography, and an Android application. The scheme reduces user interaction by automatically uploading shares and QR code details during authentication, enhancing security by minimizing errors from manual intervention.

Ensuring image security and copyright protection, especially post-COVID-19, is challenging. This paper [ 98 ] introduces SecDH, a medical data hiding scheme designed to address these challenges specifically for COVID-19 images. The scheme begins by normalizing the cover image to enhance resistance against geometric attack and computes a normalized principal component for embedding. Experimental results show SecDH's imperceptibility and advantages over traditional schemes. In a similar line, this research [100] introduces a robust technique with a high embedding capacity for color images. By fusing multi-focus images using NSCT and computing hash values for authentication, the technique enhances information richness and security. Embedding the fused image and hash value into the cover media using transformed-domain schemes, along with encryption, ensures higher security. Additionally, a hybrid optimization algorithm computes an optimal factor for improved imperceptibility and robustness. Experimental results demonstrate the technique's effectiveness and resistance to common attacks, achieving a 9.5% increase in robustness and an 8.8% enhancement in quality compared to existing works.

Further, the research [ 99 ] proposes SIELNet, a robust encryption algorithm for color images. Utilizing a novel chaotic map and custom network, SIELNet ensures secure data transmission and storage. Experimental results validate its superior performance, promising enhanced data integrity in Industry 5.0.

Furthermore, the evaluation of these techniques relies on a diverse set of metrics that assess their performance in terms of security, robustness, capacity, perceptual quality, and statistical characteristics. This research background provides an overview of the key evaluation metrics, tools, and attacks used for steganography and cryptography, including their definitions and significance in assessing the effectiveness of covert communication methods. With the help of the following information on evaluation criteria, tools, and attacks, numerous research papers spanning both the cryptography and steganography domains have been analyzed and are presented in Table 10 . This provides readers with in-depth information to facilitate their understanding of the Results section with clarity.

Evaluation criteria

Peak signal to noise ratio (PSNR) [ 1 , 5 , 7 , 92 , 95 , 97 ] PSNR is a widely used metric in image processing that quantifies the quality of reconstructed signals by measuring the ratio of the peak signal power to the noise power. In steganography, PSNR is employed to evaluate the perceptual quality of stego images by comparing them to their original counterparts, with higher PSNR values indicating better image fidelity. The PSNR of an 8-bit grey-level image is defined as follows: \(PSNR=10\,{log}_{10}\left(\frac{{255}^{2}}{MSE}\right)\) dB, where MSE is the mean square error between the original and stego images.

Mean square error (MSE) [ 1 , 22 , 92 , 95 , 97 ] MSE measures the average squared difference between the pixel values of the original and reconstructed signals, providing a quantitative measure of reconstruction accuracy. In steganography, MSE is utilized to assess the distortion introduced by embedding hidden data, with lower MSE values indicating reduced perceptual distortion.
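Under the standard definitions for 8-bit images (peak value 255), both metrics are simple to compute; a minimal sketch over 2D pixel lists:

```python
import math

def mse(img1, img2):
    """Mean squared error between two equally sized 2D pixel arrays."""
    h, w = len(img1), len(img1[0])
    return sum((img1[i][j] - img2[i][j]) ** 2
               for i in range(h) for j in range(w)) / (h * w)

def psnr(img1, img2, peak=255):
    """Peak signal-to-noise ratio in dB; infinite for identical images."""
    e = mse(img1, img2)
    return float("inf") if e == 0 else 10 * math.log10(peak ** 2 / e)
```

As a rough rule of thumb often used in the steganography literature, stego images with PSNR above about 40 dB are considered perceptually indistinguishable from their covers.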

Correlation coefficient (CC) [ 1 , 9 , 22 , 95 ] CC serves as a robust metric commonly applied to evaluate message correlation, particularly within image formats through median filtering. While not extensively employed in steganography, its utility is more pronounced when messages adopt image form. In the realm of image watermarking, CC finds wider usage owing to the prevalent image-based nature of watermarks. Notably, CC does not hinge on error quantification but centers on computing the correlation between the original message image pixels and their counterparts extracted from the message. Consequently, CC values, ranging from −1 to 1, signify correlation strength, with 1 denoting optimal correlation. Its computation can be executed using the following (Pearson) equation: \(CC=\frac{\sum_{i}({x}_{i}-\overline{x})({y}_{i}-\overline{y})}{\sqrt{\sum_{i}{({x}_{i}-\overline{x})}^{2}}\sqrt{\sum_{i}{({y}_{i}-\overline{y})}^{2}}}\), where \({x}_{i}\) and \({y}_{i}\) are corresponding pixel values and \(\overline{x}\), \(\overline{y}\) their means.
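The correlation coefficient above is the Pearson correlation between corresponding pixel values; a minimal sketch over two flattened pixel sequences:

```python
import math

def correlation_coefficient(x, y):
    """Pearson correlation between two equal-length pixel sequences;
    returns a value in [-1, 1], with 1 meaning perfect correlation."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```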

Capacity [ 5 , 92 , 95 , 97 ] Capacity refers to the maximum amount of hidden information that can be embedded within a cover signal without causing perceptible distortion. In steganography, capacity metrics assess the payload capacity of steganographic algorithms, guiding the selection of embedding techniques to achieve a balance between data hiding capacity and perceptual quality.

Structural similarity index (SSIM) [ 7 , 22 , 33 , 96 ] SSIM is a metric used in image processing to quantify the similarity between two images. It considers luminance, contrast, and structure, mimicking human visual perception, and is widely used in research to evaluate the quality of image compression, denoising, and restoration algorithms.

Human visual system (HVS) metrics [ 7 , 95 ] HVS metrics model the perceptual characteristics of the human visual system to evaluate the visual quality and perceptibility of stego signals. In steganography, HVS metrics such as the Structural Similarity Index (SSIM) and perceptual entropy are utilized to assess the visibility of embedded data and ensure imperceptibility to human observers.

Entropy [ 33 , 88 , 96 ] Entropy measures the randomness or uncertainty of a signal and is used to quantify the information content of cover and stego signals. In steganography and encryption, entropy metrics assess the statistical properties of stego and cipher signals, with values approaching the 8-bit maximum indicating noise-like statistics that reveal little about the hidden information. The entropy can be calculated for an 8-bit image as follows:

\(H\left(I\right)=-\sum_{i=1}^{{2}^{8}}P({I}_{i})\,{log}_{b}\,P({I}_{i})\) , where \({I}_{i}\) denotes the i-th intensity value and \(P({I}_{i})\) represents the probability of intensity value \({I}_{i}\) .
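For an 8-bit image the entropy reduces to a histogram computation; a minimal sketch with base b = 2, so the result is in bits per pixel:

```python
import math

def shannon_entropy(pixels: bytes) -> float:
    """Shannon entropy (bits per pixel) of an 8-bit intensity buffer.
    0 for a constant image, 8 for a perfectly uniform histogram."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    n = len(pixels)
    return -sum((c / n) * math.log2(c / n) for c in hist if c)
```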

Histogram analysis [ 9 , 45 , 92 , 95 , 97 ] Histogram analysis examines the distribution of pixel intensities in cover and stego signals to detect statistical anomalies introduced by steganographic embedding. In steganalysis, histogram-based metrics evaluate the statistical differences between cover and stego signals, facilitating the detection of hidden information.

Bit error ratio (BER) [ 9 , 13 , 22 , 95 ] BER quantifies the ratio of incorrectly received bits to the total number of transmitted bits and is used to measure the accuracy of data transmission in digital communication systems. In steganography, BER is employed to evaluate the accuracy of data extraction from stego signals, with lower BER values indicating a higher level of data integrity.
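A minimal sketch of BER over two equal-length byte streams, counting differing bits via XOR:

```python
def bit_error_ratio(sent: bytes, received: bytes) -> float:
    """Fraction of bits that differ between two equal-length streams."""
    assert len(sent) == len(received)
    diff = sum(bin(a ^ b).count("1") for a, b in zip(sent, received))
    return diff / (8 * len(sent))
```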

Bits per pixel (BPP) [ 41 , 61 , 85 , 95 ] BPP measures the average number of embedded bits per pixel in stego images and is used to quantify the embedding efficiency of steganographic algorithms [ 96 ]. In steganography, BPP metrics assess the trade-off between embedding capacity and visual quality, guiding the selection of embedding parameters.

Signal-to-noise ratio (SNR) [ 14 , 95 ] SNR measures the ratio of signal power to noise power and is used to quantify the quality of transmitted signals in communication systems. In steganography, SNR metrics evaluate the robustness of steganographic algorithms to noise interference, with higher SNR values indicating better signal quality.

Amplitude difference (AD) [ 14 ] AD measures the difference in amplitude or magnitude between the original plaintext and the corresponding ciphertext resulting from the encryption process. It quantifies the level of distortion introduced during encryption, with lower AD values indicating minimal alteration in amplitude between the plaintext and ciphertext. The assessment of AD aids in evaluating the perceptual quality and robustness of cryptographic algorithms, ensuring that encrypted data retains fidelity and is resistant to unauthorized tampering.

Avalanche effect (AE) [ 14 ] AE characterizes the sensitivity of a cryptographic algorithm to small changes in the input, resulting in significant changes in the output ciphertext. A robust cryptographic algorithm exhibits a pronounced avalanche effect, where even minor modifications in the input plaintext lead to extensive changes in the resulting ciphertext. AE plays a pivotal role in assessing the security and strength of encryption algorithms, as it indicates the extent to which encrypted data conceals underlying patterns and resists cryptanalysis attempts aimed at deciphering the original plaintext.
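The avalanche effect can be demonstrated with any strong primitive; the sketch below uses SHA-256 as a convenient stand-in for a block cipher, flipping a single input bit and measuring the fraction of output bits that change (approximately half, for a well-behaved primitive):

```python
import hashlib

def hamming_bits(a: bytes, b: bytes) -> int:
    """Number of differing bits between two equal-length byte strings."""
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

def avalanche_ratio(msg: bytes, bit_index: int = 0) -> float:
    """Fraction of SHA-256 output bits that change when one input
    bit of msg is flipped; expected to be close to 0.5."""
    flipped = bytearray(msg)
    flipped[bit_index // 8] ^= 1 << (bit_index % 8)
    d1 = hashlib.sha256(msg).digest()
    d2 = hashlib.sha256(bytes(flipped)).digest()
    return hamming_bits(d1, d2) / 256
```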

Bits per code (BPC) [ 14 ] BPC refers to the average number of bits used to represent each symbol or code in a given data encoding scheme or communication system. It quantifies the efficiency of data representation and transmission by measuring the ratio of the total number of bits to the total number of codes or symbols transmitted. In data encoding and compression techniques, a lower BPC indicates higher efficiency in representing data using fewer bits, while ensuring minimal information loss or distortion.

Throughput [ 14 ]: Throughput represents the rate at which data is successfully transmitted or processed over a communication channel or system within a specific time. It measures the amount of data transferred per unit time and is typically expressed in bits per second (bps) or a similar unit of data transmission rate. Throughput is influenced by factors such as channel bandwidth, data encoding efficiency, error correction mechanisms, and system latency. Higher throughput values indicate greater data transmission capacity and efficiency, enabling faster and more reliable communication.

Uncorrectable error rate (UER) [ 14 ] UER is a metric used in error detection and correction systems to quantify the frequency or probability of errors that cannot be successfully detected or corrected by error correction mechanisms. It represents the rate of errors that remain undetected or uncorrected despite the implementation of error detection and correction techniques. A low Uncorrectable Error Rate is desirable in communication systems, indicating a high level of reliability and effectiveness in error detection and correction processes.

Cronbach’s alpha (CA) [ 19 ] Cronbach's alpha is a measure of the internal consistency and reliability of a set of measurement items, ensuring that they consistently measure the intended construct across different datasets or scenarios.

Composite reliability (CR) [ 19 ] Composite reliability is another measure of internal consistency reliability, similar to Cronbach's alpha. It evaluates the reliability of a set of items in measuring a latent construct, taking into account the factor loadings of the items.

Average variance extracted (AVE) [ 19 ] AVE is a measure of convergent validity in structural equation modeling (SEM). It assesses the amount of variance captured by a latent construct in relation to the variance due to measurement error.

Structural equation modeling (SEM) [ 19 ] SEM is a statistical method used to test and validate theoretical models that specify relationships among observed and latent variables. It allows researchers to assess the structural relationships between variables and evaluate the goodness-of-fit of the proposed model.

Normalized chi-square (Normalized χ2) [ 19 ] Normalized chi-square is a goodness-of-fit measure used in SEM, indicating the discrepancy between the observed and expected covariance matrices relative to the degrees of freedom.

Goodness-of-fit index (GFI) [ 19 ] GFI is a measure of the overall fit of the structural equation model to the observed data. It assesses the extent to which the model reproduces the observed covariance matrix.

Root mean square error (RMSE) [ 19 ] RMSE is a measure of discrepancy between observed and predicted values in SEM. It quantifies the average difference between observed and model-estimated covariance matrices, with lower RMSE values indicating better model fit.

Normed fit index (NFI) [ 19 ] NFI is a goodness-of-fit index in SEM that evaluates the relative improvement in the fit of the proposed model compared to a null model. Higher NFI values indicate a better fit.

Tucker-Lewis index (TLI) [ 19 ] TLI, also known as the Non-Normed Fit Index (NNFI), is a measure of incremental fit in SEM. It compares the proposed model to a baseline model with uncorrelated variables, with TLI values close to 1 indicating a good fit.

Comparative fit index (CFI) [ 19 ] CFI is another measure of incremental fit in SEM, assessing the improvement in the fit of the proposed model relative to a null model. CFI values close to 1 indicate a good fit.

Normalized cross-correlation coefficient (NCCC) [ 33 ] NCCC is employed to measure the similarity between the cover and stego-images. A high NCCC value close to 1 signifies that the steganographic process has been performed effectively, resulting in minimal detectable differences between the original cover image and the stego-image, thereby ensuring the concealment of hidden information within the cover image. This can be evaluated as \({\gamma }_{p,q}=\frac{cov(p,q)}{\sqrt{D(p)}\sqrt{D(q)}}\), where p and q represent two variables that can denote either the secret and decrypted images in the cryptography process or the cover and stego-images in the steganography process. The correlation coefficient is \(\gamma \), and \(cov(p,q)\), \(D(p)\), and \(D(q)\) denote the covariance and variances [ 33 ] of the variables p and q.
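The coefficient above is the ordinary Pearson correlation applied to pixel values. A minimal NumPy sketch, assuming 8-bit grayscale images stored as arrays (the cover/stego example data is synthetic):

```python
import numpy as np

def corr_coeff(p: np.ndarray, q: np.ndarray) -> float:
    """gamma_{p,q} = cov(p, q) / (sqrt(D(p)) * sqrt(D(q)))."""
    p = p.astype(np.float64).ravel()
    q = q.astype(np.float64).ravel()
    cov = ((p - p.mean()) * (q - q.mean())).mean()  # covariance
    return cov / (np.sqrt(p.var()) * np.sqrt(q.var()))

rng = np.random.default_rng(0)
cover = rng.integers(0, 256, size=(64, 64))
stego = cover.copy()
stego[0, 0] ^= 1                 # flip a single least significant bit
print(corr_coeff(cover, stego))  # close to 1: embedding barely perturbs the image
```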

Number of pixel change rates (NPCR) [ 33 ] The NPCR metric is utilized during the encryption stage to evaluate the disparity between cipher images before and after a single pixel alteration in a plaintext image. Let P represent the total number of pixels, where C1 and C2 denote the cipher images before and after the pixel change, respectively. Additionally, D is a bipolar array defined such that \(D(i,j)=0\) if \(C1(i,j)=C2(i,j)\), and \(D(i,j)=1\) otherwise. The NPCR determines the percentage of differing pixel values between the original and encrypted images. This metric gauges the resilience of the encryption method against potential intrusions and attacks, with higher NPCR values indicating a stronger strategy. \(N\left(C1,C2\right)=\sum_{i,j}\frac{D(i,j)}{P}\times 100\%\) .
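The definition translates directly into a few lines of NumPy (the toy cipher images below are illustrative, not output of any cipher from the surveyed papers):

```python
import numpy as np

def npcr(c1: np.ndarray, c2: np.ndarray) -> float:
    """N(C1, C2) = sum_{i,j} D(i,j) / P * 100%,
    with D(i,j) = 1 where the cipher images differ, 0 where they agree."""
    d = (c1 != c2)
    return d.sum() / d.size * 100.0

c1 = np.zeros((10, 10), dtype=np.uint8)
c2 = c1.copy()
c2[0, 0] = 255          # a single differing pixel out of P = 100
print(npcr(c1, c2))     # -> 1.0
```

A strong image cipher is typically expected to score above 99% under a one-pixel plaintext change.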

Unified average changing intensity (UACI) [ 33 ] UACI calculates the mean intensity of the differences between two images with the following formula: \(UACI=\frac{1}{P}\sum_{p,q}\frac{\left|{I}_{1}\left(p,q\right)-{I}_{2}\left(p,q\right)\right|}{255}\times 100\%\), where \({I}_{1}\) and \({I}_{2}\) represent the two encrypted images derived from the original image by altering a single pixel, P is the total number of pixels, and p and q denote the coordinates of the pixel being considered.
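A sketch of the computation, using the conventional definition in which the sum of absolute pixel differences is normalized by the total pixel count P and the 8-bit range (the example images are synthetic):

```python
import numpy as np

def uaci(i1: np.ndarray, i2: np.ndarray) -> float:
    """Mean absolute pixel difference, normalized by 255, as a percentage."""
    diff = np.abs(i1.astype(np.int64) - i2.astype(np.int64))
    return diff.mean() / 255.0 * 100.0

i1 = np.zeros((8, 8), dtype=np.uint8)
i2 = np.full((8, 8), 255, dtype=np.uint8)
print(uaci(i1, i2))     # -> 100.0 (every pixel changed by the maximal amount)
```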

Percentage residual difference (PRD) [ 34 ] This metric assesses the variance between the original ECG host signal and the resulting watermarked ECG signal, calculated as \(PRD=\sqrt{\frac{\sum_{i=1}^{N}{({x}_{i}-{y}_{i})}^{2}}{\sum_{i=1}^{N}{x}_{i}^{2}}}\), where \(x\) represents the original ECG signal, and \(y\) is the watermarked signal.
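The metric is a ratio of residual energy to signal energy. A minimal sketch with a hypothetical signal (note that PRD is often multiplied by 100 and reported as a percentage; the formula here returns the raw ratio):

```python
import numpy as np

def prd(x: np.ndarray, y: np.ndarray) -> float:
    """PRD = sqrt( sum (x_i - y_i)^2 / sum x_i^2 )."""
    x = np.asarray(x, dtype=np.float64)
    y = np.asarray(y, dtype=np.float64)
    return float(np.sqrt(np.sum((x - y) ** 2) / np.sum(x ** 2)))

x = np.array([3.0, 4.0, 5.0])        # stand-in for an ECG host signal
print(prd(x, x * 1.1))               # ~0.1: watermark adds 10% relative distortion
```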

Weighted wavelet percentage residual difference (WWPRD) [ 34 ] This metric is used particularly in the context of watermarking techniques. It is employed to evaluate the effectiveness of image watermarking algorithms by quantifying the perceptual differences between the original image and the watermarked version. In WWPRD, the residual difference between the original image and the watermarked image is calculated in the wavelet domain. By analyzing the WWPRD values, researchers can assess the trade-off between watermark invisibility (how imperceptible the watermark is to human observers) and robustness (how resistant the watermark is to various image processing operations and attacks).

Steganography and steganalysis tools used

Stegdetect [ 10 ] This tool is designed to detect and analyze hidden information within digital media, providing users with powerful steganalysis capabilities. It employs advanced algorithms and techniques to identify subtle modifications or anomalies in digital files that may indicate the presence of hidden information. StegDetect [ 10 ] is widely used by digital forensics experts, law enforcement agencies, and cybersecurity professionals to uncover hidden threats and investigate potential security breaches.

Steganalysis attack [ 61 , 75 , 86 ]

Salt and pepper noise Salt and Pepper Noise, also known as impulse noise, introduces sporadic white and black pixels in an image, resembling grains of salt and pepper scattered throughout the image. This type of noise typically occurs due to errors in data transmission or faults in image acquisition devices.
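Simulating this attack in evaluation pipelines is straightforward: pick a random fraction of pixels and force them to the extreme values. A sketch assuming 8-bit grayscale arrays (the `amount` parameter and the mid-gray test image are illustrative choices):

```python
import numpy as np

def add_salt_and_pepper(img: np.ndarray, amount: float = 0.05,
                        rng=None) -> np.ndarray:
    """Corrupt a random fraction `amount` of pixels:
    half to 0 (pepper), half to 255 (salt)."""
    rng = rng if rng is not None else np.random.default_rng()
    noisy = img.copy()
    u = rng.random(img.shape)            # one uniform draw per pixel
    noisy[u < amount / 2] = 0            # pepper
    noisy[u > 1 - amount / 2] = 255      # salt
    return noisy

img = np.full((64, 64), 128, dtype=np.uint8)
noisy = add_salt_and_pepper(img, amount=0.05, rng=np.random.default_rng(0))
```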

Additive white gaussian noise (AWGN) AWGN is a type of noise that follows a Gaussian distribution and is characterized by its constant power spectral density across all frequencies. It represents random variations in pixel values added to the original image, often resulting from electronic interference or sensor noise in imaging devices.
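A common way to apply this attack to a stego-image under test is to add zero-mean Gaussian samples and clip back to the valid range; `sigma` below is an arbitrary illustrative noise level:

```python
import numpy as np

def add_awgn(img: np.ndarray, sigma: float = 10.0, rng=None) -> np.ndarray:
    """Add zero-mean Gaussian noise of standard deviation `sigma`,
    clipping the result back to the 8-bit range."""
    rng = rng if rng is not None else np.random.default_rng()
    noisy = img.astype(np.float64) + rng.normal(0.0, sigma, size=img.shape)
    return np.clip(noisy, 0, 255).astype(np.uint8)

img = np.full((64, 64), 128, dtype=np.uint8)
noisy = add_awgn(img, sigma=10.0, rng=np.random.default_rng(1))
```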

Median filtering Median filtering is a spatial domain filtering technique commonly used to remove impulsive noise such as Salt and Pepper Noise. It replaces each pixel value with the median value of its neighboring pixels within a defined window, effectively reducing the impact of outliers caused by noise.
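A minimal 3×3 median filter can be written with plain NumPy (replicate padding at the borders is one of several reasonable edge-handling choices):

```python
import numpy as np

def median_filter_3x3(img: np.ndarray) -> np.ndarray:
    """Replace each pixel by the median of its 3x3 neighborhood
    (edges handled by replicate padding)."""
    h, w = img.shape
    padded = np.pad(img, 1, mode='edge')
    # Nine shifted views of the padded image, one per window position.
    windows = np.stack([padded[r:r + h, c:c + w]
                        for r in range(3) for c in range(3)])
    return np.median(windows, axis=0).astype(img.dtype)

img = np.full((5, 5), 100, dtype=np.uint8)
img[2, 2] = 255                       # one impulse ("salt") pixel
clean = median_filter_3x3(img)        # the outlier is replaced by the median
```

This also shows why median filtering threatens fragile LSB-style embedding: any payload bits carried by isolated pixel values are destroyed.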

Lowpass filtering Lowpass filtering is a technique used to suppress high-frequency components in an image while preserving low-frequency information. It is commonly employed to mitigate noise by smoothing the image, thereby reducing the effect of high-frequency noise components such as AWGN.

Wiener filtering Wiener filtering is a signal processing technique used to deconvolve images corrupted by additive noise, such as AWGN. It employs a frequency domain approach to estimate and suppress the noise while enhancing the signal-to-noise ratio in the restored image.

Sharpening Sharpening techniques aim to enhance the perceived sharpness and clarity of an image by accentuating edges and details. However, when applied to noisy images, sharpening can exacerbate the visibility of noise, making it a potential target for attacks aimed at degrading image quality.

Histogram equalization attack Histogram equalization is a technique used to adjust the contrast of an image by redistributing pixel values across a wider dynamic range. However, adversaries can exploit this technique to amplify the visibility of noise, especially in regions with low contrast, thereby degrading the overall quality of the image.
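The standard equalization mapping sends each gray level through the image's normalized cumulative histogram. A sketch for 8-bit grayscale images (the low-contrast test image is synthetic; constant images are not handled):

```python
import numpy as np

def histogram_equalize(img: np.ndarray) -> np.ndarray:
    """Stretch contrast by mapping gray levels through the image's
    normalized cumulative histogram (CDF)."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum().astype(np.float64)
    cdf_min = cdf[np.nonzero(hist)[0][0]]          # CDF at first occupied level
    lut = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255)
    return np.clip(lut, 0, 255).astype(np.uint8)[img]

# A low-contrast image occupying only gray levels 100 and 110
img = np.where(np.arange(64).reshape(8, 8) % 2 == 0, 100, 110).astype(np.uint8)
eq = histogram_equalize(img)   # levels spread to the full 0..255 range
```

Because the mapping redistributes pixel values across the full dynamic range, it can break embeddings that rely on the exact histogram of the stego-image.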

Rotation attack Rotation attacks involve rotating an image by a certain angle, which can introduce geometric distortions and potentially exacerbate the visibility of noise. Adversaries may employ rotation attacks to degrade the quality of images, particularly those affected by noise, as part of malicious activities or security breaches.

Pitch removal attacks These involve the removal or alteration of specific pitch frequencies in audio signals. These attacks are often used in scenarios where certain frequency components need to be suppressed or modified, such as in audio watermarking or enhancement techniques.

Bit-plane removal attacks This type of attack targets the bit-plane decomposition of images. In digital image processing, images are often represented using a bit-plane decomposition, where each bit-plane represents a different level of image detail or intensity. Bit-plane removal attacks aim to remove or modify specific bit-planes, thereby altering the visual appearance or content of the image.

Chi-Square attack [ 89 ] This is a prominent technique used to detect the presence of hidden information within digital media, particularly images. This attack leverages statistical analysis to uncover inconsistencies or anomalies in the distribution of pixel values within an image. The rationale behind the Chi-Square Attack lies in the fact that steganographic embedding typically introduces subtle changes to the statistical properties of an image, such as the distribution of pixel values. These changes, while imperceptible to the human eye, can be detected through statistical analysis methods like the chi-square test.
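One classical instance of this idea is the pair-of-values test: LSB replacement tends to equalize the histogram counts within each pair (2k, 2k+1), so a chi-square statistic over those pairs collapses toward zero for stego-images. The sketch below is a simplified illustration with a synthetic, even-biased cover, not the detector of [89]:

```python
import numpy as np

def pov_chi_square(img: np.ndarray) -> float:
    """Chi-square statistic over pairs of values (2k, 2k+1).

    Full LSB replacement equalizes each pair's counts, driving the
    statistic toward 0; unmodified covers usually score far higher."""
    hist = np.bincount(img.ravel(), minlength=256).astype(np.float64)
    even, odd = hist[0::2], hist[1::2]
    expected = (even + odd) / 2.0
    mask = expected > 0
    return float(np.sum((even[mask] - expected[mask]) ** 2 / expected[mask]))

rng = np.random.default_rng(1)
# Synthetic cover biased toward even gray levels (pair counts far from equal)
cover = (2 * rng.integers(0, 128, size=(64, 64))).astype(np.uint8)
# Full-capacity LSB replacement with random message bits equalizes the pairs
stego = ((cover & 0xFE) | rng.integers(0, 2, size=cover.shape)).astype(np.uint8)
print(pov_chi_square(cover), pov_chi_square(stego))  # large vs. small statistic
```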

Regular singular (RS) analysis [ 75 ] RS analysis involves analyzing the regular and singular components of an image to identify irregularities or inconsistencies introduced by steganographic embedding techniques. This analysis leverages mathematical properties to distinguish between the regular content of an image and any additional hidden data.

Binary similarity measures (BSM) analysis [ 75 ] BSM are statistical measures used to assess the similarity between the binary representations of two images. In steganalysis, these measures are employed to compare the binary data of an original image with that of a potentially steganographic image. Deviations or discrepancies in binary similarity may indicate the presence of hidden data.

The next section discusses the methodology employed by this research.

3 Methodology

In this section, we outline a reproducible search strategy employed for conducting a comprehensive literature survey. Initially, data collection was performed utilizing the selected databases, namely Scopus, IEEE Digital Library, and ISI Web of Science, with search queries formulated as detailed in Sect. 3.1 . Subsequently, the study selection process was executed, as elucidated in Sect. 3.2 . Finally, data was extracted from the literature, as described in Sect. 3.3 . The Parsifal tool was employed to optimize the efficiency of the review process, including the tasks of reviewing, screening, and extracting relevant literature.

3.1 Data gathering (DG)

The initial step in the literature exploration process involves data gathering. Two distinct literature searches were conducted: one encompassing journal articles and a supplementary search focused on conference papers. These papers are also discussed in Sect. 2 . The results of the additional literature search contribute primarily to gaining further insights related to RQ1. To effectively explore the selected databases, essential keywords and criteria were identified. While both literature searches share common keywords, their criteria, such as publication year and language, were slightly adjusted to ensure a manageable scope. These criteria were refined through an iterative process that involved fine-tuning the keywords and assessing the quantity of relevant literature available on Scopus. The final keywords used for the search query can be expressed as follows:

Search Query: ("cryptography" AND "steganography") AND ("application" OR "real-world") AND ("security" OR "cyberattack" OR "cybersecurity").

Upon utilizing the specified keywords, the three databases collectively yielded a total of 749 results as of May 24th, 2023. Subsequently, inclusion criteria, encompassing year, language, and type, were applied to filter the obtained results. The application of these criteria is detailed in the following two sub-sections.

(a) DG-Literature Search 1: Journal Articles. A comprehensive literature search was conducted specifically for journal articles, with the databases accessed on May 24th, 2023. The criteria applied to this search are as follows:

Only literature published from 2010 onwards was included.

The literature must be classified as a journal article, excluding review papers, conference papers, books, and other sources.

Publications from any region are considered, but they must be in English.

The search encompassed the examination of titles, abstracts, and keywords. These criteria collectively establish the following additional query options:

year >= 2010

AND language == English

AND type == Journal Article

These search criteria, along with the keywords from Sect. 3.1, resulted in a total of 217 journal articles:

Scopus: 179

Web of Science: 31

After removing duplicates using the Parsifal tool, 194 journal articles were left for further analysis.

(b) DG-Literature Search 2: Conference Papers. Furthermore, a supplementary literature search focusing on conference papers was conducted, with the databases accessed on June 23rd, 2023. The search criteria and query vary slightly from the previous literature search as outlined below:

Only conference papers from conference proceedings published from 2018 onwards were considered.

Review papers, journal articles, books, and other sources were excluded.

Similar to the previous search, publications from any region were eligible, provided they were in English.

These criteria lead to the following query options:

year >= 2018

AND language == English

AND type == Conference Paper

AND source type == Conference Proceedings

These search criteria, along with the keywords from Sect. 3.1, resulted in a total of 147 conference papers:

Web of Science: 11

After removing duplicates using the Parsifal tool, 113 conference papers were left for further analysis.

3.2 Study selection (SS)

The subsequent stage of the literature exploration process involves the selection of pertinent studies, which comprises two distinct phases. The following seven conditions were established to ensure that only literature addressing the research questions outlined in Sect. 1 is considered, while filtering out literature of insufficient quality. It is important to note that the two literature searches applied these conditions differently.

The paper focuses on researching the combination of cryptography and steganography disciplines.

The paper investigates the application of cryptography and steganography within specific domains (e.g., medical, military, financial) or contexts rather than a general application for "secure communications."

The paper addresses efforts to enhance the security of a system or process rather than solely transmitting additional data.

Is the objective of the paper clearly defined?

Have related works been adequately studied?

Is the methodology employed in the paper clearly described?

Are the results presented clearly and measurably?

(a) SS-Literature Search 1: Journal Articles. In the first literature search focused on journal articles, papers were assessed for relevance based on conditions 1–3 (Sect. 3.2 ), considering the information presented in the title and abstract. Subsequently, papers were further scrutinized to determine if they met conditions 4–7 (Sect. 3.2 ) by examining their contents. Only papers that fulfilled all seven conditions were included in the selection process. As a result of this rigorous selection process, the initial total of results was reduced to 24 journal articles. The flow chart depicted in Fig. 2 a illustrates the sequential steps involved in data gathering and study selection. Papers that discussed no specific application, such as "secure communications," were not categorized as such since a significant number of such papers were already omitted during the search query phase. Including them in the list would have resulted in an incomplete compilation of relevant articles.

(b) SS-Literature Search 2: Conference Papers. In the second literature search, focusing on conference papers, the selection process entailed examining papers for conditions 1–3 (Sect. 3.2 ) based on the information presented in the title and abstract. These conditions were crucial in determining whether a paper should be considered for RQ1. As a result of this selection process, 21 conference papers met the criteria. It is worth noting that two papers were identified as having been released before 2018 and were subsequently manually filtered out. The flow chart illustrated in Fig. 2 b provides a visual representation of the data-gathering process and study selection for this search.

3.3 Data extraction (DE)

The third step of exploring literature is extracting data. Data extraction consists of two parts, both performed using Parsifal. To answer RQ1, features related to a paper’s application were extracted (both literature searches). The list of features evolved during the process of extraction as it was expanded, restructured, and finalized (Sect. 3.1 ) to encompass all encountered literature. Next, to answer RQ2 and RQ3, information related to the algorithms and metrics, advantages, limitations, and evaluation methods discussed by the literature was extracted (only literature search 1: journal articles). The results of data gathering, study selection, and data extraction are presented in the subsequent sections.

4 Results

In this section, we present the comprehensive findings derived from the systematic review, addressing the research questions outlined in Sect. 1 . To facilitate a better understanding of the findings, figures and tables are provided. The subsequent sections are organized in alignment with the order of the research questions. Section 4.1 delves into the encountered types of applications and explores potential categorization approaches. Additionally, Sect. 4.2 discusses the applications, their limitations, and advantages identified during the review process. Lastly, Sect. 4.3 focuses on the analysis methods employed in the literature. By following this structured arrangement, we aim to provide a clear and cohesive presentation of the research findings, offering valuable insights into the combined application of steganography and cryptography in various domains and contexts.

4.1 RQ1: exploring applications

For each study, relevant characteristics pertaining to the context in which the combined application of steganography and cryptography is explored were extracted. The analysis of the literature emphasizes the significance of categorizing the application of each article in two distinct ways:

The application domain : This refers to the specific industry sector or domain in which an application operates. The encountered application domains include financial, government, medical, and transportation.

The technological domain/technology [72]: This aspect involves identifying one or more technological topics associated with an application. Technologies are considered tools that can be employed across various domains to solve diverse problems or perform various tasks. The encountered technologies include Big Data, Blockchain, Cyber-Physical Systems (CPS), Cloud Computing (Cloud), Edge Computing (Edge), Fog Computing (Fog), Internet of Things (IoT), IPv6, Machine Learning (ML), Mobile Computing (Mobile), Personal Computing (Personal), Satellite Imaging (Satellite), Unmanned Aerial Vehicles (UAVs), and Voice Operated Systems (Voice).

By employing these two distinct categorizations, namely Application Domain and Technological Domain , it becomes possible to identify specific commonalities and differences within the applications. This facilitates informed research and the development of tailored solutions for specific application domains or technologies. Notably, this categorization approach differs from how other reviews, as exemplified by [ 45 ], typically categorize applications. While some studies may focus on applications specific to a particular application domain, such as the medical domain, other articles ([ 1 , 7 , 9 , 14 , 19 , 33 , 36 , 85 , 86 , 90 ] [ 5 , 10 ]) may exclusively concentrate on applications within a technological domain. A technological domain can be applicable across numerous application domains. Given these considerations, categorization by application domain is given precedence, and cases, where the application domain or technological domain could not be determined, have been excluded from categorization. Furthermore, irrespective of the application or technological domain, the specific focus or functionality of each application is also determined.

Functionality: This refers to the specific features, tasks, or roles performed by an application within its domain. It is important to note that security is considered a common role across the explored literature and is, therefore, not specified as functionality. Examples of functionalities include Smart Monitoring, Anonymization, Healthcare Data Transmission, Vehicle Diagnostics, Malware Detection, and Industry 4.0/5.0 Implementation.

The subsequent sections present the results obtained from both literature searches, providing further insights into the combined application of steganography and cryptography.

4.1.1 Journal articles

The findings from literature search 1, pertaining to RQ1, are presented in two tables. Table 1 provides an overview of articles and their corresponding application domains, while Table 2 focuses on the technologies employed, reflecting the split categorization approach. The core functionality of each paper studied for this review is explicitly mentioned in both tables. In cases where certain studies solely concentrate on a technological domain, potential application domains have been specified in italics (please refer to Table 2 ). These application domains are either suggested by the authors themselves or inferred based on similar literature. It is worth noting that technology often has applicability across a broader range of application domains. In such instances, the application domain is identified as ‘Cross-Domain.’ As showcased in Table 1 and Table 2 , a total of 12 journal articles from each table were analyzed, with each article focusing on distinct application domains and their corresponding technological domains.

Figure 3 a displays journal articles published from 2010 to 2023, categorized by application domains (Medical, Government, and Transportation) or technological domains (N/A). The figure reveals a modest increase in articles exclusively centered on technological domains, surpassing those focused on application domains. Considering the diverse potential of these technologies across various application domains (e.g., IoT [ 63 ]), it is advisable to prioritize innovation in a broader sense. Subsequently, refining these technologies for specific application domains holds the potential for even greater rewards. On the other hand, Fig. 3 b presents conference papers published between 2018 and 2023. In addition to the medical domain, as observed in the Journal articles shown in Fig. 3 a, there is a notable trend toward the financial domain in conference papers in the realm of combining and applying cryptography and steganography. More information on the Conference papers can be found in subsection 4.1.2.

figure 3

Distribution of literature in application and technological domains (N/A) over time

Figure 4 provides visual representations of the distribution of application domains (Fig. 4 a) and technological domains (Fig. 4 c) based on the data presented in Table 1 . In Fig. 4 a, it is evident that the majority (n = 9, [ 13 , 22 , 34 , 41 , 45 , 56 , 61 , 78 , 88 ]) out of the total 12 articles focus on the medical domain, suggesting a relatively narrow focus of research in this area. Furthermore, only a small number of articles concentrate on governmental applications (n = 2, [ 75 , 87 ]) and transportation (n = 1, [ 48 ]). Similarly, the occurrences of technological domains are visualized in Fig. 4 c. Notably, technologies with an occurrence of 1 are grouped under 'Other,' which includes Big Data, Fog Computing, Web Applications, Personal Computing, Edge Computing, and Cyber-Physical Systems. The visualization in Fig. 4 b and d is completed in subsection 4.1.2, where the conference papers are analyzed in depth. After analysis of the journal articles, it becomes evident that only one article from 2018 focuses on an application in the Transportation domain. Furthermore, in both 2021 and 2022, there is a lack of publications in the medical domain, whereas the four preceding years had such publications. Another notable observation is the surge in articles focusing on a specific technology in 2022. However, the applications discussed in these articles ([ 7 , 36 , 86 , 90 ]) seem unrelated, making it challenging to identify any underlying reason behind this trend.

figure 4

Distributions of domains of journal articles (left) and conference papers (right)

Furthermore, an attempt was made to employ the VOSviewer tool to identify any authorship overlap among the identified journal articles. However, none of the articles displayed any shared authors, indicating a dispersed distribution of researchers working on the topic. This suggests that research on the combined approach of steganography and cryptography is relatively new, aligning with the increasing trend observed in the number of articles over the past 13 years in Fig. 3 a. However, it is important to consider that additional factors may contribute to this observation. A more detailed discussion of journal articles focusing on specific application domains is provided in Sect. 4.2 .

4.1.2 Conference papers

To gain further insights, conference papers were also subjected to analysis. The results of this additional literature search are presented in Tables 3 and 4 , providing additional data for a comprehensive review. Similarly to journal papers, the publication years of conference papers are depicted in Fig. 3 b. Notably, there has been a relatively consistent number of papers published each year, suggesting either a sustained interest in combining steganography and cryptography or a stabilization of the field following a previous period of change. However, due to time constraints, papers published before 2018 were not explored in this study. Surprisingly, from 2018 to 2023, out of the 21 papers analyzed, only a few (n = 5) focused on specific application domains (as seen in Fig. 4 b). These papers predominantly spanned the medical domain (n = 3, [ 23 , 29 , 47 ]) and a newly emerging financial domain (n = 2, [ 52 , 62 ]). Once again, the medical domain emerged as the most popular area of application. Furthermore, while approximately 50% of the identified literature in journal articles explored applications in specific domains, only 24% of conference papers did the same. This disparity may further emphasize the trend of developing technologies in a more generalized sense rather than focusing exclusively on specific application domains. Similarly, Fig. 4 d showcases the technological domains, revealing the presence of three prominent technologies shared between journal articles and conference papers: Mobile Computing, the Internet of Things, and Cloud Computing, with Cloud Computing being particularly prevalent. It should be noted that making a direct comparison between the two searches is challenging due to the difference in the time covered by the literature.

4.2 RQ2: advantages, limitations, and trade-offs

In this section, we discuss the observations made regarding the algorithms and methodologies employed in the journal articles. Firstly, we present general observations, and subsequently, we delve into the three application domains encountered in the journal articles, namely Government, Medical, and Transportation. The research papers are arranged in ascending order according to these three categories (as listed in Table 1 ). The full data collected for RQ2 can be found in Table 5 . Further, there are other categories identified from Tables 1 and 2 , such as Cross-Domain, Medical-Military, Energy, Medical, Finance, and Military; Cross-Domain is also reflected in Table 5 . The research papers are discussed in Sect. 2 .

Government application domain

This category focuses on two articles [ 75 , 87 ] that explore the application of both steganography and cryptography in the government domain, specifically in the areas of surveillance and voting. These articles are listed in Table 6 . Each article presents different approaches with their respective strengths and limitations.

The first article [ 75 ] proposes a two-tiered video surveillance system that offers robustness against cipher-breaking attacks. However, the quality of the recovered data is dependent on the compression rate of the Compressed Sensing (CS) technique used. Additionally, the system could be enhanced to accommodate more than two levels of authorization.

The second article [ 87 ] introduces an online voting system that ensures individual verifiability and security. However, it is susceptible to certain security challenges, such as collusion among polling officers and network eavesdropping. The system provides receipts to voters, but this poses a potential issue in case users lose their receipts. Improvements, such as exploring alternative algorithms, may enhance the system's performance, such as reducing the size of receipts.

Overall, these articles highlight different aspects and considerations in the government domain when implementing steganography and cryptography, emphasizing both the strengths and areas for potential improvement in their respective approaches.

Medical application domain

This section focuses on nine articles that explore applications in the medical domain. The articles are listed in Table 7 , along with their respective advantages and limitations.

Among these articles, three papers ([ 56 , 61 , 88 ]) incorporate the use of chaotic algorithms in their encryption methods. For example, [ 56 ] presents a transmission system for generic data that utilizes chaotic encryption based on a 2D-Henon map ([ 84 ]). However, limited practical implementation details are provided, and future works could be drawn upon [ 53 ] for a more in-depth analysis of the implementation aspects. One drawback is that these three papers lack performance analysis and key measurements such as Computation Time (CT) and Throughput (TP) for the chaotic algorithms. This limitation hampers the assessment of their potential for real-time systems. Nevertheless, [ 61 , 88 ], which also employs chaotic encryption, can serve as inspiration for similar approaches. It should be noted that not all chaotic encryption algorithms, due to their complex iterative operations, are suitable for real-time systems. However, less resource-intensive methods like [ 60 ] could be considered viable alternatives. This aspect could be explored as a future research direction in the field.

Health data in IoT

Two papers ([ 22 , 34 ]) focus on health data transmissions from IoT devices, particularly in the context of remote patient monitoring. These devices typically prioritize low power consumption and low computational complexity. In [ 34 ], data is concealed within ECG signals, while [ 22 ] utilizes image steganography. Both papers employ encryption before embedding the data. In [ 34 ], the receiver must possess knowledge of the encryption and embedding keys, and no key is transmitted. On the other hand, [ 22 ] embeds both the data and the encryption key. [ 34 ] employs XOR cipher for its computational simplicity, while [ 22 ] utilizes AES ([ 30 ]) and RSA ([ 90 ]) encryption methods. It is worth considering more secure or efficient alternatives, such as TEA and its variants [ 50 ] or hardware-accelerated AES ([ 55 ]), for IoT devices. Both papers utilize multi-level DWT (Discrete Wavelet Transform) for steganography. These differences highlight the range of methodologies employed to safeguard patient data during IoT transmissions.

Embedding location restrictions

Among the medical papers focused on healthcare data transmissions, two ([ 41 , 88 ]) discuss methods that impose restrictions on data embedding locations. In [ 88 ], the Distance Regularized Level Set Evolution (DRLSE) algorithm [ 42 ] is utilized to identify the Region of Interest (ROI) and Non-Region of Interest (NROI) in a medical image. Data is embedded in the NROI using adaptive Pixel Expansion Embedding (PEE) to achieve higher capacity. For the ROI, a custom algorithm based on histogram-shifting with contrast enhancement is employed to ensure visual clarity. In this paper, data embedding is performed before image encryption. In contrast, [ 41 ] also identifies ROI and NROI areas, specifically in DICOM images. However, in this case, the encryption process is conducted before the identification of these areas. Edge detection techniques such as the Gabor Filter and Canny Edge [ 55 ] are employed for area identification. Patient data is only embedded in the NROI to preserve image quality. Additionally, to maintain the verifiability of integrity, which is crucial in medical applications, an ROI-generated hash is embedded in the NROI. These approaches demonstrate different strategies for data embedding in specific areas of medical images, highlighting the preservation of image quality, visual clarity, and the importance of integrity verification in healthcare applications.

Transportation Application Domain

An article focuses on an application in the transportation domain; its advantages and limitations are listed in Table 8. In [48], a system is proposed to securely deliver diagnostic data to manufacturers and handle firmware updates. Although the system is innovative, it has potential drawbacks, such as extended decryption times and inefficiency when handling larger software updates. To address these challenges, future work could investigate more efficient cryptographic algorithms and adapt the method to better accommodate larger files, a common requirement for software updates. Moreover, future research in the transportation domain could explore vehicle-to-vehicle (V2V) networks, where minimizing communication latency and message size is essential.

4.3 General observations

Several observations can be made regarding all the journal articles. Firstly, the steganography methods commonly employed in the identified applications primarily focus on images, as indicated in Table 9.

There is a noticeable underutilization of other cover media such as audio, signal, hardware, video, and text. This gap highlights the need for further investigation in these areas. Within the medical domain specifically, 7 out of 9 articles utilize image steganography. The choice of image-based steganography in medical applications is effective, considering the frequent use of medical imaging. However, there is potential for diversifying data types by exploring other forms of steganography, such as video steganography for recorded surgeries or signal steganography beyond ECG signals. This diversification would enhance the usability and robustness of steganography in various systems.

Secondly, in certain applications [22, 44], the encryption key is embedded together with the data in the cover medium. This eliminates the need for a separate communication channel (in the case of dynamic keys) or pre-established cryptographic keys.
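
Embedding the key together with the data is essentially a framing decision; a minimal sketch (a hypothetical length-prefixed layout, not the exact format used in [22] or [44]) is a concatenation the extractor can split unambiguously:

```python
def pack_payload(key: bytes, ciphertext: bytes) -> bytes:
    # [2-byte big-endian key length][key][ciphertext]
    assert len(key) < 65536, "key length must fit in the 2-byte prefix"
    return len(key).to_bytes(2, "big") + key + ciphertext

def unpack_payload(blob: bytes) -> tuple:
    # read the length prefix, then split the key from the ciphertext
    n = int.from_bytes(blob[:2], "big")
    return blob[2:2 + n], blob[2 + n:]
```

Note that when the key travels inside the stego object, confidentiality rests entirely on the embedding remaining undetected, so this design only makes sense when imperceptibility is strong.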

Thirdly, it is noteworthy that 42% of the identified articles, spanning various application and technological domains, incorporate a Reversible Data Hiding (RDH) technique. RDH techniques enable the lossless reconstruction of the original cover media after the hidden data has been extracted. This capability is particularly crucial in sectors such as healthcare, where preserving the integrity of the original data, such as medical imagery, is often of utmost importance [13, 22, 41, 44, 61, 88].
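
The mechanics of one classic RDH family, histogram shifting, can be sketched in a few lines (a minimal illustration of the general technique, not any specific cited scheme; it assumes the image has an empty histogram bin above its peak):

```python
def hs_embed(pixels, bits):
    # pixels: flat list of 8-bit values; returns (stego pixels, peak p, zero bin z)
    hist = [0] * 256
    for v in pixels:
        hist[v] += 1
    p = hist.index(max(hist))  # peak bin: embedding capacity = hist[p]
    z = next(v for v in range(p + 1, 256) if hist[v] == 0)  # first empty bin above p
    assert len(bits) <= hist[p], "payload exceeds capacity"
    shifted = [v + 1 if p < v < z else v for v in pixels]  # free the bin p+1
    out, it = [], iter(bits)
    for v in shifted:
        if v == p:
            b = next(it, None)
            out.append(p + b if b is not None else p)  # bit 1 -> p+1, bit 0 -> p
        else:
            out.append(v)
    return out, p, z

def hs_extract(stego, p, z, n):
    # recover the n payload bits and a pixel-exact copy of the original cover
    bits, restored, seen = [], [], 0
    for v in stego:
        if v in (p, p + 1) and seen < n:
            bits.append(v - p)
            restored.append(p)
            seen += 1
        elif p + 1 < v <= z:
            restored.append(v - 1)  # undo the histogram shift
        else:
            restored.append(v)
    return bits, restored
```

Extraction returns both the payload and the original pixel values, which is the defining property that distinguishes RDH from ordinary LSB-style embedding.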

Based on these findings, it is evident that research needs to diversify in terms of both methods and cover media. Attention should be given to the security challenges of government applications, and a more comprehensive assessment of the performance of chaotic algorithms in medical domains is required. Additionally, a wider range of steganography methods should be explored for healthcare data transmissions. In the transportation domain, other cryptographic algorithms should be examined to handle larger data files effectively. Overall, addressing these areas of improvement can significantly enhance data security across various sectors.

4.4 RQ3: analyzing evaluation methods used

In this section, we discuss the analysis and evaluation methods utilized in the journal articles, which are listed in Table 10. The analysis of steganography typically revolves around four main concepts: capacity, robustness, security, and imperceptibility (sometimes divided into undetectability and invisibility) [4, 68, 82]. Cryptography evaluation, on the other hand, focuses on security, encryption time, key size, plaintext vs. ciphertext size, and other related factors [26, 83]. Considering the similarities between these concepts, they are grouped into three perspectives: Security, Performance, and User. These perspectives are interconnected and interdependent, as demonstrated in Fig. 5.

Fig. 5 The three discussed analysis perspectives

4.4.1 Security perspective

Similar to cryptography, which can be vulnerable to different types of attacks such as ciphertext and plaintext attacks [49], steganography is susceptible to analogous attack types, including known-carrier and known-message attacks [49]. The significance of safeguarding against these attacks depends on the order in which steganography and cryptography are applied.

When data is embedded first and then encrypted, the primary defense against attacks lies in the strength of the encryption itself. Several articles, such as [13, 57, 87, 88] (listed in Table 5 in Section 3.2), follow this order of operations. Among these, some also address advanced attacks, including histogram equalization [9, 44, 61, 88], while only one article tackles rotation attacks [61]. Conversely, when data is encrypted first, the primary defense lies in the imperceptibility of the stego object. The majority of applications follow this order of operations, as evidenced by articles such as [13, 22, 34, 41, 44, 48, 61, 75], among others. These implementations primarily focus on achieving steganographic imperceptibility, using metrics such as PSNR, SSIM, MSE, and BER, and rely heavily on cryptographic evaluations from previous works. Even articles proposing custom or more complex encryption methods ([34, 44, 48, 56, 61, 88]) still analyze cryptographic security as an integral part of their evaluation.
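
Of these metrics, MSE, PSNR, and BER are straightforward to compute over flattened pixel and bit sequences; a minimal sketch (SSIM is omitted, as it requires windowed statistics):

```python
import math

def mse(cover, stego):
    # mean squared error between two equal-length pixel sequences
    return sum((a - b) ** 2 for a, b in zip(cover, stego)) / len(cover)

def psnr(cover, stego, peak=255):
    # peak signal-to-noise ratio in dB; infinite when the images are identical
    err = mse(cover, stego)
    return math.inf if err == 0 else 10 * math.log10(peak ** 2 / err)

def ber(sent, received):
    # bit error rate: fraction of payload bits flipped between sender and receiver
    return sum(a != b for a, b in zip(sent, received)) / len(sent)
```

Higher PSNR (lower MSE) indicates a less perceptible embedding, while BER quantifies robustness of the extracted payload after channel distortions.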

The following insights are drawn based on the security perspective:

Vulnerability to attacks: Like cryptography, steganography is prone to various attacks, including analogues of ciphertext and plaintext attacks. This underscores the necessity of robust defenses against potential security breaches.

Order of operations: The sequence in which steganography and cryptography are applied influences the defense mechanisms against attacks. Whether data is embedded first and then encrypted, or vice versa, dictates where the primary defense lies: in the strength of the encryption or in the imperceptibility of the stego object.

Advanced attack consideration: Some articles address advanced attacks, such as histogram equalization and rotation attacks, highlighting the importance of considering sophisticated attack vectors that may compromise the invisibility of stego objects.

Emphasis on imperceptibility: The majority of implementations encrypt data first and therefore prioritize steganographic imperceptibility, concealing hidden data within digital media while maintaining the appearance and quality of the original content.

Integration of cryptographic security: Even articles proposing custom encryption methods analyze cryptographic security comprehensively, underscoring the interdependence between cryptographic measures and steganographic techniques in securing hidden information.

4.4.2 Performance perspective

The performance of encountered systems can be influenced by several factors, including computation time (CT), capacity (related to steganography), and key size (related to cryptography).

Computation time, which encompasses both steganography and cryptography, is particularly important because it correlates with power consumption, making it a crucial consideration in real-time and power-sensitive systems. While articles such as [1, 13, 33, 36, 44, 48] incorporate CT measurements, only two similar applications [1, 36] specifically address the need to manage power consumption in their environments. CT is often reported as a "total time" or analyzed individually for different components of the system, such as embedding time, extraction time, and encryption time; the latter approach allows for more targeted performance improvements. Interestingly, among the seven articles exploring Internet of Things (IoT) applications, three [7, 22, 34] do not use time-based analysis metrics, making it difficult to accurately assess the performance and efficiency of their proposed applications. A time-based analysis is vital for a comprehensive understanding of application performance: it reveals not only the speed of processes but also how efficiently system resources are utilized.
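
Per-component CT measurement of this kind needs nothing more than a small timing harness; the sketch below (with hypothetical `encrypt` and `embed` stage functions) reports per-stage wall-clock times so encryption and embedding can be profiled separately:

```python
import time

def time_stage(fn, *args, repeat=5):
    # best-of-`repeat` wall-clock time of one pipeline stage, in seconds
    best, result = float("inf"), None
    for _ in range(repeat):
        start = time.perf_counter()
        result = fn(*args)
        best = min(best, time.perf_counter() - start)
    return result, best

def profile_pipeline(encrypt, embed, cover, plaintext, key):
    # measure each stage individually, then report a component-wise breakdown
    ciphertext, t_enc = time_stage(encrypt, plaintext, key)
    stego, t_emb = time_stage(embed, cover, ciphertext)
    return {"encryption_s": t_enc, "embedding_s": t_emb, "total_s": t_enc + t_emb}
```

Reporting the breakdown rather than only a total makes it clear which stage dominates, and hence where optimization effort (or a lighter cipher) would pay off.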

Another significant metric is capacity. The balance between imperceptibility and capacity matters to different degrees depending on the application. In certain (real-time) applications where relatively small data fragments are shared, the capacity of the cover medium may not be critical, and imperceptibility may likewise be of lesser relevance. Out of the 24 articles analyzed, capacity is evaluated in 9 [5, 7, 9, 13, 41, 61, 75, 85, 86], either in comparison to other implementations or by examining different parameters within the same implementation. Notably, only one IoT-focused article ([7]) specifically analyzed the capacity of the employed steganographic method.

The key size of a cryptographic algorithm can have a significant impact on encryption time, as explained in [40]. In the IoT context, [22] specifically addresses cryptographic operations using an AES key size of 128 bits. Although AES-128 is generally regarded as secure, larger key sizes can be employed, and more efficient encryption algorithms could potentially allow larger keys while maintaining similar encryption times. Surprisingly, the discussion or justification of key sizes for well-known cryptographic algorithms is rarely addressed in the analyzed literature.

Upon examining the research papers listed in Table 10, the following insights regarding performance are observed:

The analysis revealed key considerations related to computation time, capacity, and key size.

Computation time was emphasized as critical due to its association with power consumption, especially in real-time and power-sensitive systems.

Capacity, concerning the balance between imperceptibility and capacity in steganography, was noted to vary depending on specific application requirements.

The analysis underscored the significant impact of key size selection in cryptographic algorithms on encryption time, highlighting the importance of careful consideration in algorithm design.

Despite the importance of these factors, the analysis revealed areas where certain metrics, such as time-based analysis in IoT applications, were lacking, making it challenging to comprehensively assess performance and efficiency.

4.4.3 User perspective

The user perspective evaluates how effectively a system incorporating steganography and cryptography aligns with the user's workflow, emphasizing factors such as ease of use, comprehension, trust, processing time, and system stability. The impact of the system on the user's workflow is particularly crucial for applications where the user directly interacts with the system. However, even when the system operates in the background, it can still influence the user's experience, albeit to a lesser degree. From the reviewed literature, it is observed that only a limited number of studies include usability tests to analyze user experience. For instance, the e-voting system discussed in [75] incorporates usability and user acceptance testing using Nielsen's quality components [58] and Davis' Technology Acceptance Model (TAM) [20], respectively; these are well-established methods for assessing the usability and acceptance of a system. Similarly, the NFC access control scheme presented in [19] includes usability, perceived vulnerability, perceived security, and behavioral intention tests to examine how the proposed security scheme could influence user behavior, with methods adapted from previous works [15, 35, 76].

Applications such as remote patient monitoring ([34]) aim to provide a user-friendly experience, requiring minimal complex setup from the user's perspective; any additional complexity introduced by steganography or cryptography should ideally be abstracted away from the user. However, the only user interaction highlighted in the article concerns imperceptibility to the Human Visual System (HVS), where doctors inspect ECGs. Similarly, the application of hiding files in audio files on PCs [5] is closely related to end-users, but the article does not delve further into this aspect and omits user testing. This omission creates an evaluation gap, as the actual user experience and potential areas for improvement remain unexamined.

User experience can also be significantly influenced by the other perspectives, security and performance. If the combination of steganography and cryptography leads to excessively slow data processing, or if the system lacks robustness against attacks like compression or cropping, it could compromise the user's ability to effectively manage stego objects (e.g., share or post-process them). This could result in data loss or corruption, ultimately degrading the overall user experience. Robust implementations of steganography and cryptography are therefore essential for maintaining a high-quality user experience.

After analyzing the User perspective criteria, we identify the following insights:

Despite the importance of user experience, there is a noted lack of usability tests in the reviewed literature, with only a few studies incorporating established methods such as Nielsen's quality components and Davis' Technology Acceptance Model (TAM).

Applications aim to provide a user-friendly experience, with additional complexity introduced by steganography or cryptography ideally abstracted away from the user to ensure ease of use.

User experience can be significantly impacted by factors like security and performance, with slow data processing or lack of robustness against attacks compromising the effective management of stego objects and degrading overall user experience.

Robust implementations of steganography and cryptography are crucial for maintaining a high-quality user experience, highlighting the importance of considering user-centric factors in system design and evaluation.

4.5 General observations

In summary, the evaluation of steganography and cryptography requires a comprehensive analysis that encompasses security, performance, and user perspectives. Unfortunately, several studies overlook certain metrics, creating gaps in our understanding of computation time, capacity, key size, and user-friendliness. It is crucial to strike a balance between steganography and cryptography to ensure an optimal user experience, robust security, and efficient performance. Future research should aim to address these oversights and strive for a more comprehensive evaluation framework.

5 Conclusion and future scope

This review examines the state of combined steganography and cryptography applications in journal articles and conference papers, categorized by application and technological domains. While medical applications dominate, the IoT and Cloud Computing domains show active research. Real-time constraints and privacy protection are prominent concerns in technological domains. The combined approach provides data security and privacy benefits, but trade-offs and limitations remain, and further research is needed to address these challenges and improve methodologies. The evaluation metrics vary, emphasizing domain-specific knowledge. A comprehensive evaluation framework is proposed, incorporating security, performance, and user perspectives. However, there is a notable lack of user testing in the literature, highlighting the need for user-centric system design.

This review focused solely on conference papers for RQ1 due to time constraints. Conference papers are valuable sources of the latest findings and innovative practices in the rapidly evolving field of information security, making them relevant not just for RQ1 but also for RQ2 and RQ3. Additionally, the search keywords were limited to "cryptography" and "steganography", although related terms such as "encryption" or "data hiding" are also used in the literature.

Future research could explore applications in diverse domains such as transportation and energy. Comparative studies could shed light on the advantages of using steganography or cryptography individually in different scenarios. Further investigations into non-image steganographic media and into the impact of combining steganography and cryptography on end-user experience and acceptance are also warranted.

Data availability

The datasets generated during and/or analyzed during the current study are available from the corresponding author upon reasonable request.

References

Alissa, K.A., Maray, M., Malibari, A.A., Alazwari, S., Alqahtani, H., Nour, M.K., Al Duhayyim, M.: Optimal deep learning model enabled secure UAV classification for industry. Comput. Mater. Contin. 74 (3), 5349–5367 (2023)


Abbas, M.S., Mahdi, S.S., Hussien, S.A.: Security improvement of cloud data using hybrid cryptography and steganography. In: 2020 International Conference on Computer Science and Software Engineering (CSASE), pp. 123–127. IEEE (2020)

Al Abbas, A.A.M., Ibraheem, N.B.: Using DNA in a dynamic lightweight algorithm for stream cipher in an IoT application. In: 2022 International Symposium on Multidisciplinary Studies and Innovative Technologies (ISMSIT), pp. 232–240. IEEE (2022)

Al-Ani, Z.K., Zaidan, A.A., Zaidan, B.B., Alanazi, H.: Overview: main fundamentals for steganography. arXiv preprint arXiv:1003.4086 (2010)

Al-Juaid, N., Gutub, A.: Combining RSA and audio steganography on personal computers for enhancing security. SN Appl. Sci. 1, 1–11 (2019)


Ali, M.H., Al-Alak, S.: Node protection using hiding identity for IPv6 based network. In: 2022 Muthanna International Conference on Engineering Science and Technology (MICEST), pp. 111–117. IEEE (2022)

Alsamaraee, S., Ali, A.S.: A crypto-steganography scheme for IoT applications based on bit interchange and crypto-system. Bull. Electr. Eng. Inf. 11 (6), 3539–3550 (2022)

Anderson, R.J., Petitcolas, F.A.: On the limits of steganography. IEEE J. Sel. Areas Commun. 16 (4), 474–481 (1998)

Anushiadevi, R., Amirtharajan, R.: Design and development of reversible data hiding-homomorphic encryption & rhombus pattern prediction approach. Multimed. Tools Appl. 82 (30), 46269–46292 (2023)

Badhani, S., Muttoo, S.K.: Evading android anti-malware by hiding malicious applications inside images. Int. J. Syst. Assur. Eng. Manag. 9 , 482–493 (2018)

Banga, P.S., Portillo-Dominguez, A.O., Ayala-Rivera, V.: Protecting user credentials against SQL injection through cryptography and image steganography. In: 2022 10th International Conference in Software Engineering Research and Innovation (CONISOFT), pp. 121–130. IEEE (2022)

Bharathi, P., Annam, G., Kandi, J.B., Duggana, V.K., Anjali, T.: Secure file storage using hybrid cryptography. In: 2021 6th International Conference on Communication and Electronics Systems (ICCES), pp. 1–6. IEEE (2021)

Bhardwaj, R.: An improved reversible data hiding method in encrypted domain for E-healthcare. Multimed. Tools Appl. 82 (11), 16151–16171 (2023)

Bhattacharjee, S., Rahim, L.B.A., Watada, J., Roy, A.: Unified GPU technique to boost confidentiality, integrity and trim data loss in big data transmission. IEEE Access 8, 45477–45495 (2020)

Bhuiyan, M., Picking, R.: A gesture controlled user interface for inclusive design and evaluative study of its usability. J. Softw. Eng. Appl. 4 (09), 513 (2011)

Bokhari, M.U., Shallal, Q.M.: A review on symmetric key encryption techniques in cryptography. Int. J. Comput. Appl. 147 (10), 43 (2016)

Castillo, R.E., Cayabyab, G.T., Castro, P.J.M., Aton, M.R.: Blocksight: a mobile image encryption using advanced encryption standard and least significant bit algorithm. In: Proceedings of the 1st International Conference on Information Science and Systems, pp. 117–121 (2018)

Caviglione, L., Podolski, M., Mazurczyk, W., Ianigro, M.: Covert channels in personal cloud storage services: the case of dropbox. IEEE Trans. Ind. Inf. 13 (4), 1921–1931 (2016)

Cheong, S.N., Ling, H.C., Teh, P.L.: Secure encrypted steganography graphical password scheme for near field communication smartphone access control system. Expert Syst. Appl. 41 (7), 3561–3568 (2014)

Davis, F.D.: Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Q. 13, 319–340 (1989)

Dhawan, S., Chakraborty, C., Frnda, J., Gupta, R., Rana, A.K., Pani, S.K.: SSII: secured and high-quality steganography using intelligent hybrid optimization algorithms for IoT. IEEE Access 9, 87563–87578 (2021)

Elhoseny, M., Ramírez-González, G., Abu-Elnasr, O.M., Shawkat, S.A., Arunkumar, N., Farouk, A.: Secure medical data transmission model for IoT-based healthcare systems. IEEE Access 6, 20596–20608 (2018)

Gamal, S.M., Youssef, S.M., Abdel-Hamid, A.: Secure transmission and repository platform for electronic medical images: case study of retinal fundus in teleophthalmology. In: 2020 International Conference on Computing, Electronics & Communications Engineering (iCCECE), pp. 9–14. IEEE (2020)

Ghuge, S.S., Kumar, N., Savitha, S., Suraj, V.: Multilayer technique to secure data transfer in private cloud for SaaS applications. In: 2020 2nd International Conference on Innovative Mechanisms for Industry Applications (ICIMIA), pp. 646–651. IEEE (2020)

Gupta, S., Goyal, A., Bhushan, B.: Information hiding using least significant bit steganography and cryptography. Int. J. Modern Educ. Comput. Sci. 4 (6), 27 (2012)

Gururaja, H.S., Seetha, M., Koundinya, A.K.: Design and performance analysis of secure elliptic curve cryptosystem. Int. J. Adv. Res. Comput. Commun. Eng. 2 (8), 1 (2013)

Haque, M.E., Zobaed, S.M., Islam, M.U., Areef, F.M.: Performance analysis of cryptographic algorithms for selecting better utilization on resource constraint devices. In: 2018 21st International Conference of Computer and Information Technology (ICCIT), pp. 1–6. IEEE (2018)

Sri, P.H., Chary, K.N.: Secure file storage using hybrid cryptography. Int. Res. J. Mod. Eng. Technol. Sci. (2022). https://doi.org/10.56726/IRJMETS32383

Hashim, M.M., Rhaif, S.H., Abdulrazzaq, A.A., Ali, A.H., Taha, M.S.: Based on IoT healthcare application for medical data authentication: Towards a new secure framework using steganography. In: IOP Conference Series: Materials Science and Engineering, vol. 881, no. 1, p. 012120. IOP Publishing (2020)

Heron, S.: Advanced encryption standard (AES). Netw. Secur. 2009 (12), 8–12 (2009)

Hussain, M., Wahab, A.W.A., Batool, I., Arif, M.: Secure password transmission for web applications over internet using cryptography and image steganography. Int. J. Secur. Appl. 9 (2), 179–188 (2015)

Hussein, A.A., Jumah Al-Thahab, O.Q.: Design and simulation a video steganography system by using FFTturbo code methods for copyrights application. Eastern-Euro. J. Enterp. Technol. 2 (9), 104 (2020)

Hussein, S.A., Saleh, A.I., Mostafa, H.E.D.: A new fog based security strategy (FBS2) for reliable image transmission. J. Ambient Intell. Humaniz. Comput. 11, 3265–3303 (2020)

Ibaida, A., Khalil, I.: Wavelet-based ECG steganography for protecting patient confidential information in point-of-care systems. IEEE Trans. Biomed. Eng. 60 (12), 3322–3330 (2013)

Ifinedo, P.: Understanding information systems security policy compliance: an integration of the theory of planned behavior and the protection motivation theory. Comput. Secur. 31 (1), 83–95 (2012)

Jain, D.K., Li, Y., Er, M.J., Xin, Q., Gupta, D., Shankar, K.: Enabling unmanned aerial vehicle borne secure communication with classification framework for industry 5.0. IEEE Trans. Ind. Inf. 18 (8), 5477–5484 (2021)

Jankowski, B., Mazurczyk, W., Szczypiorski, K.: PadSteg: introducing inter-protocol steganography. Telecommun. Syst. 52, 1101–1111 (2013)

Kavitha, V., Sruthi, G.S., Thoshinny, B., Riduvarshini, S.R.: Stagchain–a steganography based application working on a blockchain environment. In: 2022 3rd International Conference on Electronics and Sustainable Communication Systems (ICESC), pp. 674–681. IEEE (2022)

Khan, H.A., Abdulla, R., Selvaperumal, S.K., Bathich, A.: IoT based on secure personal healthcare using RFID technology and steganography. Int. J. Electr. Comput. Eng. 11 (4), 3300 (2021)

Kumar, M.G.V., Ragupathy, U.S.: A survey on current key issues and status in cryptography. In: 2016 International Conference on Wireless Communications, Signal Processing and Networking (WiSPNET), pp. 205–210. IEEE (2016)

Kumar, N., Kalpana, V.: A novel reversible steganography method using dynamic key generation for medical images. Indian J. Sci. Technol. 8 (16), 1 (2015)

Li, C., Xu, C., Gui, C., Fox, M.D.: Distance regularized level set evolution and its application to image segmentation. IEEE Trans. Image Process. 19 (12), 3243–3254 (2010)


Madavi, K.B., Karthick, P.V.: Enhanced cloud security using cryptography and steganography techniques. In: 2021 International Conference on Disruptive Technologies for Multi-Disciplinary Research and Applications (CENTCON), vol. 1, pp. 90–95. IEEE (2021)

Mancy, L., Vigila, S.M.C.: A new diffusion and substitution-based cryptosystem for securing medical image applications. Int. J. Electron. Secur. Digit. Forens. 10 (4), 388–400 (2018)

Mandal, P.C., Mukherjee, I., Paul, G., Chatterji, B.N.: Digital image steganography: a literature survey. Inf. Sci. 609, 1451–1488 (2022)

Mandal, S., Khan, D.A.: Enhanced-longest common subsequence based novel steganography approach for cloud storage. Multimed. Tools Appl. 82 (5), 7779–7801 (2023)

Manikandan, V.M., Masilamani, V.: Reversible data hiding scheme during encryption using machine learning. Proc. Comput. Sci. 133, 348–356 (2018)

Mayilsamy, K., Ramachandran, N., Raj, V.S.: An integrated approach for data security in vehicle diagnostics over internet protocol and software update over the air. Comput. Electr. Eng. 71, 578–593 (2018)

Mishra, R., Bhanodiya, P.: A review on steganography and cryptography. In: 2015 International Conference on Advances in Computer Engineering and Applications, pp. 119–122. IEEE (2015)

Mishra, Z., Acharya, B.: High throughput novel architectures of TEA family for high speed IoT and RFID applications. J. Inf. Secur. Appl. 61, 102906 (2021)

Mogale, H., Esiefarienrhe, M., Letlonkane, L.: Web authentication security using image steganography and AES encryption. In: 2018 International Conference on Intelligent and Innovative Computing Applications (ICONIC), pp. 1–7. IEEE (2018)

More, S.S., Mudrale, A., Raut, S.: Secure transaction system using collective approach of steganography and visual cryptography. In: 2018 International Conference on Smart City and Emerging Technology (ICSCET), pp. 1–6. IEEE (2018)

Mostaghim, M., Boostani, R.: CVC: chaotic visual cryptography to enhance steganography. In: 2014 11th International ISC Conference on Information Security and Cryptology, pp. 44–48. IEEE (2014)

Munoz, P.S., Tran, N., Craig, B., Dezfouli, B., Liu, Y.: Analyzing the resource utilization of AES encryption on IoT devices. In: 2018 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA ASC), pp. 1200–1207. IEEE (2018)

Nadernejad, E., Sharifzadeh, S., Hassanpour, H.: Edge detection techniques: evaluations and comparisons. Appl. Math. Sci. 2 (31), 1507–1520 (2008)


Bremnavas, I., Mohamed, I.R., Shenbagavadivu, N.: Secured medical image transmission through the two dimensional chaotic system. Int. J. Appl. Eng. Res. 10 (17), 38391–38396 (2015)

Neetha, S.S., Bhuvana, J., Suchithra, R.: An efficient image encryption reversible data hiding technique to improve payload and high security in cloud platforms. In: 2023 6th International Conference on Information Systems and Computer Networks (ISCON), pp. 1–6. IEEE (2023)

Nielsen, J., Molich, R.: Heuristic evaluation of user interfaces. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 249–256 (1990)

Nissar, A., Mir, A.H.: Classification of steganalysis techniques: a study. Digit. Signal Process. 20 (6), 1758–1770 (2010)

Pande, A., Zambreno, J.: A chaotic encryption scheme for real-time embedded systems: design and implementation. Telecommun. Syst. 52, 551–561 (2013)

Parah, S.A., Ahad, F., Sheikh, J.A., Bhat, G.M.: Hiding clinical information in medical images: a new high capacity and reversible data hiding technique. J. Biomed. Inform. 66, 214–230 (2017)

Patil, N., Kondabala, R.: Two-layer secure mechanism for electronic transactions. In: 2022 International Conference on Recent Trends in Microelectronics, Automation, Computing and Communications Systems (ICMACC), pp. 174–181. IEEE (2022)

Perwej, Y., Haq, K., Parwej, F., Mumdouh, M., Hassan, M.: The internet of things (IoT) and its application domains. Int. J. Comput. Appl. 975 (8887), 182 (2019)

Chen, C.P., Zhang, C.Y.: Data-intensive applications, challenges, techniques and technologies: a survey on big data. Inf. Sci. 275, 314–347 (2014)

Phipps, A., Ouazzane, K., Vassilev, V.: Enhancing cyber security using audio techniques: a public key infrastructure for sound. In: 2020 IEEE 19th International Conference on Trust, Security and Privacy in Computing and Communications (TrustCom), pp. 1428–1436. IEEE (2020)

Pokharana, A., Sharma, S.: Encryption, file splitting and file compression techniques for data security in virtualized environment. In: 2021 Third International Conference on Inventive Research in Computing Applications (ICIRCA), pp. 480–485. IEEE (2021)

Prabu, S., Ganapathy, G.: Steganographic approach to enhance the data security in public cloud. Int. J. Comput. Aided Eng. Technol. 13 (3), 388–408 (2020)

Pradhan, A., Sahu, A.K., Swain, G., Sekhar, K.R.: Performance evaluation parameters of image steganography techniques. In: 2016 International Conference on Research Advances in Integrated Navigation Systems (RAINS), pp. 1–8. IEEE (2016)

Kumar, P., Sharma, V.K.: Information security based on steganography & cryptography techniques: a review. Int. J. 4 (10), 246–250 (2014)

Preethi, P., Prakash, G.: Secure fusion of crypto-stegano based scheme for satellite image application. In: 2021 Asian Conference on Innovation in Technology (ASIANCON), pp. 1–6. IEEE (2021)

Ramamoorthy, U., Loganathan, A.: Analysis of video steganography in military applications on cloud. Int. Arab J. Inf. Technol. 19 (6), 897–903 (2022)

Angel, N.A., Ravindran, D., Vincent, P.D.R., Srinivasan, K., Hu, Y.C.: Recent advances in evolving computing paradigms: cloud, edge, and fog technologies. Sensors 22 (1), 196 (2021)

Reshma, V., Gladwin, S.J., Thiruvenkatesan, C.: Pairing-free CP-ABE based cryptography combined with steganography for multimedia applications. In: 2019 International Conference on Communication and Signal Processing (ICCSP), pp. 0501–0505. IEEE (2019)

Rout, H., Mishra, B.K.: Pros and cons of cryptography, steganography and perturbation techniques. IOSR J. Electron. Commun. Eng. 76, 81 (2014)

Issac, B., Rura, L., Haldar, M.K.: Implementation and evaluation of steganography based online voting system. Int. J. Electr. Gov. Res. 12 (3), 71–93 (2016)

Ryu, Y.S., Koh, D.H., Ryu, D., Um, D.: Usability evaluation of touchless mouse based on infrared proximity sensing. J. Usability Stud. 7 (1), 31–39 (2011)

Saleh, M.E., Aly, A.A., Omara, F.A.: Data security using cryptography and steganography techniques. Int. J. Adv. Comput. Sci. Appl. 7 (6), 390 (2016)

Sengupta, A., Rathor, M.: Structural obfuscation and crypto-steganography-based secured JPEG compression hardware for medical imaging systems. IEEE Access 8 , 6543–6565 (2020)

Shaji, A., Stephen, M., Sadanandan, S., Sreelakshmi, S., Fasila, K.A.: Phishing site detection and blacklisting using EVCS, steganography based on android application. In: International Conference on Intelligent Data Communication Technologies and Internet of Things (ICICI) 2018, pp. 1384–1390. Springer International Publishing (2019)

Siregar, B., Gunawan, H., Budiman, M.A.: Message security implementation by using a combination of hill cipher method and pixel value differencing method in mozilla thunderbird email client. In: Journal of Physics: Conference Series, vol. 1255, no. 1, p. 012034. IOP Publishing (2019)

Stanescu, D., Stratulat, M., Ciubotaru, B., Chiciudean, D., Cioarga, R., Micea, M.: Embedding data in video stream using steganography. In: 2007 4th International Symposium on Applied Computational Intelligence and Informatics, pp. 241–244. IEEE (2007)

Subhedar, M.S., Mankar, V.H.: Current status and key issues in image steganography: a survey. Comput. Sci. Rev. 13 , 95–113 (2014)

Wang, X., Zhang, J., Schooler, E.M., Ion, M.: Performance evaluation of attribute-based encryption: toward data privacy in the IoT. In: 2014 IEEE International Conference on Communications (ICC), pp. 725–730. IEEE (2014)

Wu, J., Liao, X., Yang, B.: Image encryption using 2D Hénon-Sine map and DNA approach. Signal Process. 153 , 11–23 (2018)

Xiong, L., Shi, Y.: On the privacy-preserving outsourcing scheme of reversible data hiding over encrypted image data in cloud computing. Comput. Mater. Contin. 55 (3), 523 (2018)

Xu, S., Horng, J.H., Chang, C.C., Chang, C.C.: Reversible data hiding with hierarchical block variable length coding for cloud security. IEEE Trans. Dependable Secure Comput. (2022). https://doi.org/10.1109/TDSC.2022.3219843

Yang, Y., Xiao, X., Cai, X., Zhang, W.: A secure and high visual-quality framework for medical images by contrast-enhancement reversible data hiding and homomorphic encryption. IEEE Access 7 , 96900–96911 (2019)

Zhang, L., Hu, X., Rasheed, W., Huang, T., Zhao, C.: An enhanced steganographic code and its application in voice-over-IP steganography. IEEE Access 7 , 97187–97195 (2019)

Zhang, X.G., Yang, G.H., Ren, X.X.: Network steganography based security framework for cyber-physical systems. Inf. Sci. 609 , 963–983 (2022)

Zhou, X., Tang, X.: Research and implementation of RSA algorithm for encryption and decryption. In: Proceedings of 2011 6th International Forum on Strategic Technology, vol. 2, pp. 1118–1121. IEEE (2011)

Sarmah, D.K., Kulkarni, A.J.: JPEG based steganography methods using cohort intelligence with cognitive computing and modified multi random start local search optimization algorithms. Inf. Sci. 430 , 378–396 (2018)

Yang, Y., Newsam, S.: Bag-of-visual-words and spatial extensions for land-use classification. In: Proceedings of the 18th SIGSPATIAL International Conference on Advances in Geographic Information Systems, pp. 270–279 (2010)

AID: A scene classification dataset, https://www.kaggle.com/datasets/jiayuanchengala/aid-scene-classification-datasets . Accessed 29 Feb 2024

Elshoush, H.T., Mahmoud, M.M.: Ameliorating LSB using piecewise linear chaotic map and one-time pad for superlative capacity, imperceptibility and secure audio steganography. IEEE Access 11 , 33354–33380 (2023)

Michaylov, K.D., Sarmah, D.K.: Steganography and steganalysis for digital image enhanced forensic analysis and recommendations. J. Cyber Secur. Technol. (2024). https://doi.org/10.1080/23742917.2024.2304441

Sarmah, D.K., Kulkarni, A.J.: Improved cohort intelligence—a high capacity, swift and secure approach on JPEG image steganography. J. Inf. Secur. Appl. 45 , 90–106 (2019)

Singh, O.P., Singh, A.K., Agrawal, A.K., Zhou, H.: SecDH: security of COVID-19 images based on data hiding with PCA. Comput. Commun. 191 , 368–377 (2022)

Singh, K.N., Baranwal, N., Singh, O.P., Singh, A.K.: SIELNet: 3D chaotic-map-based secure image encryption using customized residual dense spatial network. IEEE Trans. Consumer Electron. (2022). https://doi.org/10.1109/TCE.2022.3227401

Mahto, D.K., Singh, A.K., Singh, K.N., Singh, O.P., Agrawal, A.K.: Robust copyright protection technique with high-embedding capacity for color images. ACM Trans. Multimed. Comput. Commun. Appl. (2023). https://doi.org/10.1145/3580502

Download references

Author information

Authors and Affiliations

SCS/EEMCS, University of Twente, P.O. Box 217, 7500 AE Enschede, Overijssel, The Netherlands

Indy Haverkamp & Dipti K. Sarmah



Indy Haverkamp: Conceptualization, Methodology, Validation, Investigation, Formal Analysis, Data Curation, Writing—Original Draft, Visualization. Dipti Kapoor Sarmah: Methodology, Writing—Review & Editing, Visualization, Supervision, Project Administration.

Corresponding author

Correspondence to Dipti K. Sarmah.

Ethics declarations

Conflict of interest

The authors have no competing interests to declare that are relevant to the content of this article.

Human and Animal Participants

Informed consent

All authors agreed with the content and all gave explicit consent to submit.

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.


About this article

Haverkamp, I., Sarmah, D.K. Evaluating the merits and constraints of cryptography-steganography fusion: a systematic analysis. Int. J. Inf. Secur. (2024). https://doi.org/10.1007/s10207-024-00853-9

Download citation

Accepted: 12 April 2024

Published: 05 May 2024

DOI: https://doi.org/10.1007/s10207-024-00853-9


Keywords

  • Image steganography
  • Cryptography
  • Real-world applications
  • Evaluation perspectives

