276 questions
0 votes · 0 answers · 32 views
In an FPGA, how do I connect my IP core's input and output to BRAM correctly? [closed]
I use this HLS code to generate an IP core and bundle the ports:
void mnist_nn_predict(float* input, float* output) {
#pragma HLS INTERFACE s_axilite port=return bundle=CRTL_BUS
#pragma HLS ...
0 votes · 0 answers · 34 views
Elastic: how to properly reindex a data stream and keep the data stream name?
I am new to Elastic, so I am not very experienced with reindexing data streams.
I need to reindex a specific index in a data stream. I do not want to have to create a new data stream and use that data ...
0 votes · 0 answers · 57 views
Use VBA to start and stop the Excel Data Streamer
Is there any way to connect, start, and stop the Excel Data Streamer using VBA? All the Microsoft help shows only how to do this manually. No answers on this forum that I could find.
0 votes · 1 answer · 33 views
Flink GlobalWindow trigger only processes the trigger event
I have a DataStream keyed by an event property that is then passed to a GlobalWindow, triggered when a specific event comes in. The issue is that when the window is triggered to process the events, it only ...
0 votes · 1 answer · 141 views
Airbyte Postgres Kafka connection failed to run schema discovery
I'm trying to create a connection in Airbyte 0.61.0 (deployed in Docker containers on a Linux server) using PostgreSQL (connector version 3.4.10) as the source and Kafka (connector version 0.1.10) as ...
0 votes · 1 answer · 117 views
Datastream - AWS Aurora MySQL to BigQuery
I'm trying to stream/migrate data from AWS Aurora MySQL to BigQuery. By following the documentation "Create an HA VPN gateway to a peer VPN gateway" I am able to ping the private and public subnets vice versa -...
0 votes · 1 answer · 24 views
What does Flink store as state for sliding windows?
When I use an aggregation function over a sliding window, what does Flink store in state?
For example, if I aggregate a count over a sliding window with a size of 1 hour and a slide of 5 minutes, does it mean that ...
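As a rough illustration of the arithmetic behind that question: with a 1-hour window sliding every 5 minutes, each element is assigned to size/slide = 12 overlapping windows, and with an aggregate function Flink keeps one accumulator per window rather than buffering the raw elements. The sketch below mirrors that window-assignment logic in plain Python (assuming offset 0); it is not Flink API code.

def window_starts(ts_ms, size_ms, slide_ms):
    # Start timestamps of every sliding window that contains ts_ms
    # (same assignment rule Flink uses for sliding windows, offset 0).
    start = ts_ms - (ts_ms % slide_ms)
    starts = []
    while start > ts_ms - size_ms:
        starts.append(start)
        start -= slide_ms
    return starts

# 1-hour windows sliding every 5 minutes: each element lands in 12 windows.
print(len(window_starts(ts_ms=7_200_000, size_ms=3_600_000, slide_ms=300_000)))  # 12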
3 votes · 1 answer · 414 views
Is there any way to stream to a Parquet file in Ruby?
I am trying to create an archival tool for a Ruby on Rails app.
To this end, I wish to store the data in Parquet files, ideally with one Parquet file per table per time interval.
However, I do not ...
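That question is about Ruby, but for reference, the incremental-write pattern it is after looks like this with PyArrow in Python: the ParquetWriter keeps the file open and appends one row group per batch. The file name, schema, and batch contents are made up for the sketch; Ruby's Arrow bindings (the red-arrow/red-parquet gems) expose a broadly similar writer.

import pyarrow as pa
import pyarrow.parquet as pq

schema = pa.schema([("id", pa.int64()), ("name", pa.string())])

# The writer stays open, so batches can be appended as they arrive.
with pq.ParquetWriter("users_2024_06.parquet", schema) as writer:
    for batch in ({"id": [1, 2], "name": ["a", "b"]},
                  {"id": [3, 4], "name": ["c", "d"]}):
        writer.write_table(pa.table(batch, schema=schema))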
1 vote · 1 answer · 96 views
Implementation of round-robin partitioning in Apache Flink
Hi, I would like to implement round-robin partitioning for an operator in Apache Flink. Before I continue, I would like to preface that I'm well aware this is already implemented in Flink, but ...
0 votes · 1 answer · 178 views
How do I store a stream of data that is scarcely changing in Apache Flink?
Essentially, I have a Flink DataStream which reads from a Kafka topic that rarely changes.
This topic holds records like
userConsumerIdentifier
{
"user_id": 1,
"consumer_id": 1
}
...
0 votes · 1 answer · 47 views
Sliding window in PyFlink
I am new to PyFlink and I have a Kafka stream which has phone_number, host_name and event_time, all as strings. How can I compute the number of visits for each (phone_number, host_name) pair during ...
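Setting PyFlink aside for a moment, the counting logic being asked for (visits per (phone_number, host_name) pair in an overlapping sliding window) can be sketched in plain Python like this; the 10-minute size and 1-minute slide are assumptions, since the question does not state them.

from collections import defaultdict
from datetime import datetime

SIZE = 600    # assumed 10-minute window, in seconds
SLIDE = 60    # assumed 1-minute slide

def window_starts(ts, size=SIZE, slide=SLIDE):
    start = ts - (ts % slide)
    while start > ts - size:
        yield start
        start -= slide

# (window_start, phone_number, host_name) -> number of visits
counts = defaultdict(int)

def on_event(phone_number, host_name, event_time):
    ts = int(datetime.fromisoformat(event_time).timestamp())
    for start in window_starts(ts):
        counts[(start, phone_number, host_name)] += 1

on_event("555-0100", "example.com", "2024-06-01T12:03:05")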
0 votes · 1 answer · 68 views
Best way to send huge data in streaming mode through a Python Flask application?
I want to create a Flask application which returns huge data, but the problem is that my API crashes when it is asked for a huge dataset, like 6 million records. I want to make my Flask application ...
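A common way to keep a Flask endpoint from materializing millions of records in memory is to return a generator wrapped in a streaming Response, as in the sketch below; fetch_rows is a hypothetical stand-in for a server-side cursor over the real dataset.

from flask import Flask, Response, stream_with_context

app = Flask(__name__)

def fetch_rows():
    # Hypothetical placeholder: in practice this would be a server-side
    # cursor yielding one database record at a time.
    for i in range(6_000_000):
        yield {"id": i}

@app.route("/export")
def export():
    def generate():
        yield "["
        first = True
        for row in fetch_rows():
            if not first:
                yield ","
            yield '{"id": %d}' % row["id"]
            first = False
        yield "]"
    return Response(stream_with_context(generate()), mimetype="application/json")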
0 votes · 0 answers · 99 views
Compute the average of the past k values in a data stream with limited memory
I am not looking for particular code, more the general theory of what could be done for this problem.
We have a stream of incoming data (data entries are quite large, thus it is infeasible to store more ...
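One standard answer to that question: an exact mean of the last k values needs the k values themselves (a ring buffer plus a running sum gives O(k) memory and O(1) updates), while an exponential moving average approximates it in O(1) memory if even k entries are too large to keep. A minimal sketch of the exact version:

from collections import deque

class RollingMean:
    # Exact mean of the last k values: O(k) memory, O(1) per update.
    def __init__(self, k):
        self.window = deque(maxlen=k)
        self.total = 0.0

    def add(self, x):
        if len(self.window) == self.window.maxlen:
            self.total -= self.window[0]   # value about to be evicted
        self.window.append(x)
        self.total += x
        return self.total / len(self.window)

rm = RollingMean(k=3)
for v in [10, 20, 30, 40]:
    print(rm.add(v))   # 10.0, 15.0, 20.0, 30.0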
0 votes · 0 answers · 81 views
How to export MySQL data to the browser while saving memory in a Golang web application
Sometimes the data in MySQL is huge, so I hope to save memory when exporting data to the browser in a Go web application. I guess application/octet-stream is a good way, but I don't know how to do it.
So how to select ...
0 votes · 1 answer · 30 views
How to dump clustering results in MOA
Please let me know, after using a data stream clustering algorithm from MOA like CluStream, how I can dump the clustering result into a CSV file which shows which cluster each row of the ARFF data file finally ...
0 votes · 0 answers · 77 views
Datastream not pulling change data from RDS PostgreSQL
I followed the on-screen instructions to connect Datastream from AWS RDS PostgreSQL to BigQuery.
I enabled logical replication and added
CREATE PUBLICATION [MY_PUBLICATION] FOR ALL TABLES;
SELECT ...
0 votes · 0 answers · 367 views
Problems with a sink in Flink
I could do this first to try; it works well and just prints the data to another topic. I tried to follow the same logic but don't get the result.
package org.example;
...
0 votes · 1 answer · 629 views
Elasticsearch: 'x_content_parse_exception' [template] unknown field [lifecycle]
I have an ELK stack deployed from deviantony/docker-elk version 8.9.
I'm trying to create a lifecycle with a retention policy for my data stream as mentioned in the official documentation:
PUT _index_template/...
2 votes · 1 answer · 873 views
Kafka consumer with a Flask application
I have a Flask application running to serve the API endpoint. In the same application I also have to implement a Kafka consumer in order to consume events from a Kafka stream, but because of the Kafka ...
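One pattern that is often used for this: run the consumer loop in a daemon thread next to the Flask server, as sketched below with kafka-python. The topic name, broker address, and handle() function are assumptions; for anything beyond a prototype, a separate consumer process is usually the cleaner option.

import threading
from flask import Flask
from kafka import KafkaConsumer   # kafka-python

app = Flask(__name__)

def handle(payload):
    print("got event:", payload)   # placeholder for real processing

def consume_events():
    consumer = KafkaConsumer(
        "events",                           # assumed topic name
        bootstrap_servers="localhost:9092", # assumed broker
        group_id="flask-app",
    )
    for message in consumer:
        handle(message.value)

@app.route("/health")
def health():
    return "ok"

if __name__ == "__main__":
    threading.Thread(target=consume_events, daemon=True).start()
    app.run()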
0 votes · 0 answers · 116 views
Has anyone tried putting a cell in Jupyter 7 that can handle streaming data?
Please note I am asking whether this CAN be done, NOT how to do it.
I am about to try Jupyter 7 and I think it should be able to handle streaming data which updates a cell. So I would set up a stream of ...
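For what it's worth, updating a single output area from a stream is possible in Jupyter today via IPython display handles; a minimal sketch, with a sleep loop standing in for the real data stream:

import time
from IPython.display import display

handle = display("waiting for data...", display_id=True)

for i in range(10):              # stand-in for a real streaming source
    handle.update(f"latest value: {i}")
    time.sleep(1)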
0 votes · 2 answers · 606 views
Flink data stream enrichment: connecting two data streams with different throughput and multi-level keying
I'm fairly new to Flink and have wanted to join two data streams of two fictional data sources for a showcase on stateful data streaming.
These data streams provide data in the form of the following (...
0 votes · 1 answer · 689 views
Index historical time-series data into an Elasticsearch data stream - ILM
My use case is the following: I have continuously produced time-series data plus one year of history. I want to index them into Elasticsearch in such a way that data is deleted after one year (according to ...
2 votes · 1 answer · 1k views
Reading data from the web in chunks using HttpClient in C#
I am trying to call the web and receive the data that it sends back in chunks. In other words, I am trying to receive from the web and print it while more data is coming in. But I cannot find ...
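The question is about C#'s HttpClient, but the underlying pattern (keep the connection open and consume the body incrementally) is the same everywhere; as a point of comparison, here is what it looks like in Python with requests. The URL is a placeholder.

import requests

# stream=True defers downloading the body, so it can be read in chunks.
with requests.get("https://example.com/feed", stream=True) as resp:
    resp.raise_for_status()
    for chunk in resp.iter_content(chunk_size=8192):
        if chunk:
            print(chunk.decode(errors="replace"), end="", flush=True)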
0 votes · 1 answer · 385 views
How to stream data from a relational database to GCP (preferably BigQuery) through a REST API?
I have CMS data in a custom-built database I have no access to. I want to basically backfill all data and stream all new data in real time to BigQuery or any GCP storage.
Now I have checked out ...
0 votes · 1 answer · 122 views
PyFlink window aggregation not triggering
I have a problem where my window aggregation accumulates all results but does not return them, and my result stream is empty.
I suspect it has something to do with window triggering, but I cannot figure out ...
0 votes · 1 answer · 298 views
Reading from a Kinesis data stream (or any data stream) in two different tumbling time windows
I have a Kinesis data stream, and I am consuming the data using tumbling windows. I have two use cases: one is consuming the data in a 5-minute tumbling window, and the other in a 1-minute tumbling ...
0 votes · 1 answer · 103 views
Uploading real-time data to DynamoDB using a Lambda function
We're trying to upload real-time data from a Kinesis data generator to AWS DynamoDB using a Lambda function. Our code works fine while running the test cases but fails to upload data in production.
...
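For context, a Kinesis-triggered handler of the shape usually used for this looks roughly like the sketch below; the table name and payload format are assumptions, and the test-vs-production gap in cases like this often turns out to be IAM permissions, batching, or payload encoding rather than the handler itself.

import base64
import json
from decimal import Decimal

import boto3

table = boto3.resource("dynamodb").Table("events")   # assumed table name

def handler(event, context):
    # Kinesis delivers records base64-encoded under event["Records"];
    # floats are parsed as Decimal because DynamoDB rejects Python floats.
    for record in event["Records"]:
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]),
                             parse_float=Decimal)
        table.put_item(Item=payload)
    return {"processed": len(event["Records"])}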
0 votes · 2 answers · 1k views
Python: how to pass a continuously generated data stream between methods
I have a method that continuously generates data and prints it to the console. Let's say something simple like generating random numbers:
def number_generator():
    while True:
        random....
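The usual approach here is to have the producing method yield values instead of printing them, so any other method can consume the stream lazily; a minimal sketch of the generator from the excerpt rewritten that way:

import itertools
import random
import time

def number_generator():
    # Yield instead of print, so callers can consume the stream lazily.
    while True:
        yield random.randint(0, 100)
        time.sleep(0.1)

def consumer(stream):
    for value in stream:
        print("received:", value)

# Consume just the first five values for the demonstration.
consumer(itertools.islice(number_generator(), 5))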
0 votes · 1 answer · 288 views
Properly return data from stream.on()
I basically just want to return the object once the upload is finished, but the message that is sent is undefined. How can I change the code to return the data?
index.ts: route to the upload API
app....
1 vote · 1 answer · 800 views
Google Datastream (MySQL to BigQuery): no is_deleted field for tables with a primary key
I am using Datastream in GCP with a MySQL source and a BigQuery destination. I have noticed that if the source table has a primary key defined, then the Datastream metadata fields do not include the "...
0 votes · 1 answer · 265 views
UDP/TCP performance in an ideal Ethernet communication
Consider a point-to-point Ethernet connection on which you have to stream data as fast as possible.
The connection is basically ideal, with a low probability of losing data, but still some data can ...
2 votes · 1 answer · 259 views
Cloud Data Fusion: permission denied due to datastream.connectionProfiles.discover
I am trying to create a Cloud Data Fusion replication job from Oracle to BigQuery and am receiving the error below.
Failed to connect to the database due to below error:
io.grpc.StatusRuntimeException: ...
0 votes · 1 answer · 1k views
How to handle a client-side WebSocket data stream in React Native when the state changes frequently and my screens freeze
I am building a client-side paper-trading app in React Native from the data provided by a data provider over WebSocket. Suppose I subscribe to 'A' and 'B' and ..., I am getting ...
0 votes · 1 answer · 182 views
Flink: left join three DataStreams using coGroup
I'm trying to merge three streams into a single stream. I tried union but was unable to proceed, as the schemas are different, and if I merge the schemas it becomes too large.
So I'm using coGroup ...
1 vote · 1 answer · 608 views
Firebase data stream not visible, no data received in the past 48 hrs (React Native)
I've integrated React Native Firebase Analytics with custom events. I see data being present in GA, and in the debug view I'm able to view all the events triggered. I'm unable to view the data in the data ...
1 vote · 1 answer · 1k views
Valid website URL is required in Google Analytics
I tried to enter the following URL to connect my web application to Google Analytics, and it gives the error shown in the screenshot:
I tried changing the domain name and it didn't work either. If anyone ...
0 votes · 1 answer · 24 views
How do I delete selected telemetry data and streams from a Heila?
It is ideal to keep all the data until you can't. When there is a need to delete data, it doesn't necessarily have to be from all the streams. There might be data in some streams that we might want to ...
1 vote · 2 answers · 2k views
Streaming and caching tabular data with fsspec, Parquet and PyArrow
I'm trying to stream data from Parquet files stored in Dropbox (but it could be somewhere else: S3, Google Drive, etc.) and read it in Pandas, while caching it. For that I'm trying to use fsspec for Python.
...
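A minimal sketch of the fsspec pattern that question is circling: chaining the filecache:: protocol in front of the remote URL keeps a local copy on disk, so repeated reads come from the cache. The URL and cache directory are placeholders; for Dropbox, S3, or Google Drive the corresponding fsspec backend and its credentials would replace the plain HTTPS URL.

import fsspec
import pandas as pd

# "filecache::" chains an on-disk cache in front of the remote filesystem.
url = "filecache::https://example.com/data/table.parquet"   # placeholder

with fsspec.open(url, mode="rb",
                 filecache={"cache_storage": "./parquet_cache"}) as f:
    df = pd.read_parquet(f)

print(df.head())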
1 vote · 1 answer · 522 views
How to know that an AWS Kinesis event has been successfully sent to a client via a Lambda function?
I have an architecture where a Lambda function delivers the events in a Kinesis stream to a client. If an event is successfully delivered, then the event should be popped off the queue in the ...
5 votes · 0 answers · 2k views
What are the practical differences between Kafka topics & channels?
Conceptually, I get the difference. As per the Kafka docs:
[...] a topic is similar to a folder in a filesystem, and the events are the files in that folder. An example topic name could be "payments"...
0 votes · 1 answer · 81 views
Streaming data processing: joining with two different latencies
We have two transactions, but we need to configure them for future cases. I'm curious about your thoughts on this process. (I'm a newbie to streaming data.)
We have Flink and KStreams environments.
These ...
1 vote · 1 answer · 4k views
Data stream processing in .NET
I'm studying some products to implement event sourcing and, in general, data stream processing. In particular I have found Apache Pulsar very good for streaming all my events between many ...
1 vote · 1 answer · 810 views
Alpaca data not streaming
I am trying to stream data using an Alpaca paper trading account on a Windows 10 terminal, but instead of streaming data it returns empty. Does anybody know what I am doing wrong?
Note: I have ...
1 vote · 1 answer · 191 views
Continuous data generator from Azure Databricks to Azure Event Hubs using Spark with the Kafka API, but no data is streamed
I'm trying to implement a continuous data generator from Databricks to an Event Hub.
My idea was to generate some data in a .csv file and then create a data frame with the data. In a loop I call a ...
1 vote · 2 answers · 1k views
How to select a Kafka topic dynamically in the Apache Flink Kafka sink?
I'm using KafkaSink as the sink in my Flink application, and I need to send stringified JSONs to different Kafka topics based on some key-value pairs (for example, a few JSONs go to topic1 and a few ...
1 vote · 0 answers · 464 views
Stream data from Azure Databricks to an Azure Event Hub via the Kafka API from a CSV file
I am new to Azure Databricks and Event Hubs. I have been struggling for days to stream data from Databricks using Spark and the Kafka API to an event hub. The data I want to stream is in a .csv file. The ...
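Both of these Event Hubs questions come down to the same Kafka-sink configuration; a hedged sketch of the shape it usually takes in PySpark is below. The namespace, hub name, connection string, and checkpoint path are placeholders, the rate source stands in for the CSV-derived stream, and the kafkashaded. class prefix applies on Databricks clusters (drop it elsewhere).

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Toy streaming source; in the question this would be the CSV-derived frame.
df = (spark.readStream.format("rate").load()
        .selectExpr("CAST(value AS STRING) AS value"))

connection_string = "Endpoint=sb://NAMESPACE.servicebus.windows.net/;..."  # placeholder
eh_sasl = (
    "kafkashaded.org.apache.kafka.common.security.plain.PlainLoginModule "
    f'required username="$ConnectionString" password="{connection_string}";'
)

(df.writeStream
   .format("kafka")
   .option("kafka.bootstrap.servers", "NAMESPACE.servicebus.windows.net:9093")
   .option("kafka.security.protocol", "SASL_SSL")
   .option("kafka.sasl.mechanism", "PLAIN")
   .option("kafka.sasl.jaas.config", eh_sasl)
   .option("topic", "EVENT_HUB")                        # placeholder hub name
   .option("checkpointLocation", "/tmp/eh-checkpoint")
   .start())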
0 votes · 1 answer · 255 views
How to connect Apache Storm with Grafana?
I am collecting data in JSON format that I process in real time with Apache Storm. I would now like to use Grafana to be able to perform real-time visualizations on this processed data. Is there a way ...
0 votes · 1 answer · 930 views
TcpClient.GetStream() returns null
I have a TcpPeer class that represents the TCP side when converting from another data transfer method (this is what the IPeer interface is for). But for a reason unknown to me, TcpClient.GetStream() in ...
0 votes · 1 answer · 317 views
Not able to connect an Oracle database to GCP Datastream
I am trying to create a connection profile in GCP Datastream for an Oracle database. When I try to connect to the Oracle database, it says the hostname or port configuration is not correct, even though ...
1 vote · 1 answer · 2k views
Kinesis stream consumption: LATEST vs. TRIM_HORIZON
I have a use case where I need to keep the Kinesis trigger on my consumer (let's call it Lambda B) disabled while the producer (let's call it Lambda A) writes to the Kinesis stream. Once the write ...