By now you know that your data has great strategic and operational value, and that Data Management is needed to exploit that value effectively and efficiently.
Data Management is primarily a task for people. There are, however, numerous IT tools that can support you in it.
But what do these tools do, and which ones do you really need?
This article describes the 10 most important forms of IT support for data management available on the market.
These are often offered as separate products, or combined with other functionality in a single software suite, usually via the cloud. Well-known providers of Data Management solutions include Tibco, MarkLogic, Oracle, SAP, Informatica, IBM, Microsoft, Talend, Amazon and SAS.
Harald van der Weel: 10 essential tools for more effective data management
This is the second blog about event driven architecture. As described in the first blog, events occur and are relevant at several levels, and it matters at which level you want to implement your EDA. In this blog I will concentrate on the application-to-application level and beyond, and describe a reference architecture that we use at SynTouch.
Event Driven Architecture
Events are the driving force within the event driven architecture. Events are produced, detected, consumed and reacted to. One of the main principles of EDA is the separation and isolation of the event sources from the handling of the event notifications. The advantages and use cases for an EDA are:
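That separation principle can be sketched in a few lines. The following is a minimal in-process illustration only (the broker class, topic name and handler here are invented for the example; a real EDA would use a message platform such as Kafka), showing that a producer announces events without knowing who handles them:

```python
# Minimal in-process event broker illustrating the EDA principle:
# sources publish event notifications; handlers are registered
# independently, so producer and consumers stay decoupled.

from collections import defaultdict

class EventBroker:
    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, event_type, handler):
        # Consumers register interest; the producer never sees them.
        self._handlers[event_type].append(handler)

    def publish(self, event_type, payload):
        # The source only announces that something happened.
        for handler in self._handlers[event_type]:
            handler(payload)

broker = EventBroker()
received = []
broker.subscribe("order.created", lambda event: received.append(event))
broker.publish("order.created", {"order_id": 42})
print(received)  # [{'order_id': 42}]
```

Adding a second subscriber requires no change to the publishing side, which is exactly the isolation the principle above describes.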
Roger van de Kimmenade: Event Driven Architecture – Part 2
Digitalization has been under way for a number of years. This year the digital transformation has even accelerated: to curb the spread of the coronavirus, working from home and ordering online have received a boost. Companies are forced to change their existing traditional business models at an accelerated pace. Restaurants now deliver complete three-course dinners to people's homes, and artists give private concerts and performances via livestreams: services increasingly tailored to customers and their wishes.
Data increasingly proves its value to organizations. Data allows them to improve products, services and processes, reduce risks, and gain competitive advantage by predicting demand more accurately.
Data Management is the key to profiting optimally from the benefits data has to offer. Successful Data Management (DM) delivers the right data to the right stakeholder or process at the right time.
DAMA provides a standardized approach to organizing Data Management. DAMA addresses several topics ('knowledge areas') that are elaborated separately in the DAMA-DMBOK (Data Management Body of Knowledge). Think of domains like Data Storage, Data Interoperability, Data Warehousing and Data Security. Each domain describes its own objectives, activities, processes, deliverables and principles – the rules for conducting data management effectively. In total, DAMA lists about 150 of those principles.
As this number of rules is hard to live by, we have analyzed them all and condensed them into a manageable set of 10 golden rules that together define the fundamental critical success factors of DAMA.
Harald van der Weel: 10 Golden Rules for Data Management
Data is the buzzword these days: organizations want to be data driven. They create a lot of data, want to create value from this data, and want the data fast and available 24/7. The IT architecture therefore needs to be flexible, scalable, resilient, responsive and message (data) driven.
This is where an Event Driven Architecture (EDA) can help, and in the next 4 blogs I want to guide you on this event journey.
This first blog will be about events in general and how they can be utilized at all kinds of levels within an organization. In the second blog I will talk about a reference architecture that organizations can use as a starting point. It explains which capabilities are relevant in such an architecture. In the third blog I will describe some principles and guidelines that you can use within an event driven architecture. In the last blog I will explain how an organization can become event driven. I will take you through a maturity model and the steps that you can take to reach the level your organization needs to achieve its business goals.
"Data is the new gold" is an often-heard credo. Unlike with gold, however, we ourselves have a great influence on the value of our data. The SynTouch Data Value Chain shows how this value can be created step by step, from raw "material" to a precious "company jewel". #datavaluechain #data
After setting up the environment, it is now time to simulate the beer ratings flowing in. As explained, I will start several generators simultaneously. To create some (intended) data skew away from the average, several generators will share the same structural event definition, but they will differ in their combination of users, beers, and upper and lower rating bounds. Of course this is also based on my personal preference – who said my demonstration scenario should be fair?
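The intended skew can be illustrated with a small simulation. This is a hedged sketch in plain Python rather than the actual ksql-datagen setup: the user names, beer names and rating bounds are invented for the example, but the idea is the same – identical event structure, different value ranges per generator:

```python
import random

def make_generator(users, beers, low, high):
    """Return a rating generator. All generators produce the same
    event structure but differ in users, beers and rating bounds,
    which skews the resulting distribution away from the average."""
    def generate():
        return {
            "user": random.choice(users),
            "beer": random.choice(beers),
            "rating": round(random.uniform(low, high), 1),
        }
    return generate

# Two generators with the same structure but skewed bounds:
# one group rates their favourite beer high, the other rates low.
fans = make_generator(["alice", "bob"], ["Westvleteren 12"], 4.0, 5.0)
critics = make_generator(["carol"], ["Generic Lager"], 1.0, 2.5)

events = [fans() for _ in range(5)] + [critics() for _ in range(5)]
for event in events:
    print(event)
```

Running both generators side by side yields a ratings topic whose average hides two very different populations, which is exactly the kind of skew that makes the later KSQL aggregations interesting.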
In this blog, I am going to zoom in on KSQL and the opportunities it offers for manipulating streaming data in Kafka using merely SQL-like statements. One of the neat things about the Confluent Kafka platform is that it provides additional utilities on top of the core Kafka tools. One of these utilities is ksql-datagen, which allows users to generate random data based on a simple schema definition in Apache Avro.
For a long time I have been interested in Apache Kafka and its applications. Unfortunately, forced by circumstances, work and other personal endeavours, I had not been able to really dive deeper into the matter until Spring 2019. In April I finally finished the Udemy course "Apache Kafka for Beginners".
At work, my exposure to Kafka had been limited, as we were (ultimately) publishing messages onto a Kafka topic using Oracle Service Bus. However, this was actually a Java-built integration, as we were just pushing the messages onto a JMS queue, which had an MDB listening that propagated the messages to the Kafka cluster.
After completing the first training I got interested, especially in the role of Kafka in real-time event systems, and I decided to take another course, on Kafka Streams. I was a bit disappointed that this specific course focused quite heavily on Java development, and as an exception I decided to abandon the course uncompleted. During one of the Kafka Meetups, I found out that Confluent was actually offering a very interesting alternative to programming the Kafka Streams API in Java, viz. KSQL.
Charles Darwin already said it: "It is not the strongest of the species that survives, nor the most intelligent that survives. It is the one that is most adaptable to change." Likewise, companies need to continuously improve their (ICT) processes to keep their services optimally aligned with an ever faster changing world.
Willem de Jong: The challenges of the Agile transformation