Hi guys,
Disclaimer
This is an advice thread. I don't want code; I just want to know whether I'm on the right track. I have about 5 years of experience with C# and Unity, so programming isn't entirely new to me, but Java is.
I am planning to make a dashcam app and did some research on one specific feature. For all the other features I already have a plan.
Description
If you have an accident, you want to see what happened just before it, but you might not have had your dashcam recording. So I want to continuously keep the last 15 seconds in some kind of buffer: new data is added on top while the oldest data is discarded (so really a queue rather than a stack).
I was looking into the Android documentation and found something I think could work:
setPreviewCallbackWithBuffer(Camera.PreviewCallback cb)
As I understand it, this gives me a preview frame as a byte[] every 1/framerate seconds, which I could push into a collection acting as the buffer I described. Combining the frames into a video shouldn't be that big of a deal.
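Not asking for an implementation, but to make the idea concrete: the buffer I have in mind could look roughly like this minimal Java sketch. The class and method names are my own invention, and the capacity would be about 15 s × frame rate (e.g. 450 frames at 30 fps):

```java
import java.util.ArrayDeque;

// Rough sketch of the rolling frame buffer described above: a fixed-capacity
// queue that drops the oldest frame when a new one arrives and it is full.
public class FrameRingBuffer {
    private final ArrayDeque<byte[]> frames = new ArrayDeque<>();
    private final int capacity;

    public FrameRingBuffer(int capacity) {
        this.capacity = capacity;
    }

    // Would be called from the preview callback with each new frame.
    public synchronized void push(byte[] frame) {
        if (frames.size() == capacity) {
            frames.pollFirst(); // discard the oldest frame
        }
        frames.addLast(frame);
    }

    public synchronized int size() {
        return frames.size();
    }

    // Snapshot of the buffered frames, oldest first — e.g. to hand to an
    // encoder when the user taps "save" after an incident.
    public synchronized byte[][] snapshot() {
        return frames.toArray(new byte[0][]);
    }
}
```

With a capacity of 3, pushing frames 0..4 leaves frames 2, 3, 4 in the buffer, oldest first.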
Question
To break things down a bit:
I've never worked with video files before. Is this at least possible, and in the best case, can it be done the way I described?
Thanks in advance