Why not. After all, the IMU is currently 'solder-on' so we can try different ones without changing the entire design.
With a little trick (connecting the solder-on boards with small pins) they can also be easily swapped.
All that is required is a tiny PCB with the proper connections.
Feb 23 2017
Feb 20 2017
Feb 9 2017
Jan 22 2017
Dec 2 2016
Oct 30 2016
Oct 27 2016
What about this guy?
Other candidates are:
What is the price?
I'm not sure what the desirable specifications for an IMU are, but the InvenSense MPU-9250 is pretty easy to use and rather inexpensive. It supports sample rates of up to 32 kHz.
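To give a feel for working with such a part, here is a minimal sketch of converting raw MPU-9250 samples to physical units. The scale factors assume the chip's default full-scale ranges (±2 g for the accelerometer, ±250 °/s for the gyro); check the datasheet if different ranges are configured.

```python
# Sketch: converting raw MPU-9250 readings to physical units.
# Scale factors assume the default full-scale ranges: +/-2 g for the
# accelerometer (16384 LSB/g) and +/-250 deg/s for the gyro (131 LSB/(deg/s)).

ACCEL_LSB_PER_G = 16384.0   # +/-2 g range
GYRO_LSB_PER_DPS = 131.0    # +/-250 deg/s range

def raw_to_accel_g(raw: int) -> float:
    """Convert a signed 16-bit accelerometer sample to g."""
    return raw / ACCEL_LSB_PER_G

def raw_to_gyro_dps(raw: int) -> float:
    """Convert a signed 16-bit gyroscope sample to degrees per second."""
    return raw / GYRO_LSB_PER_DPS

print(raw_to_accel_g(16384))  # 1.0 (i.e. 1 g)
print(raw_to_gyro_dps(131))   # 1.0 (i.e. 1 deg/s)
```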
Oct 10 2016
Maybe this is the right time and place to suggest NixOS.
It is not primarily used in embedded devices and is not very end-user friendly right now, but it has some advantages, like atomic updates (which would help here to easily roll back updates) and declarative configuration.
It is built around the Nix package manager, which can also be used on other Linux distributions and even OS X (theoretically on Windows too, but no one is working on that). They are also working on reproducible builds.
The community is very active and helpful! I use NixOS on most servers at home.
You can find my OS configurations here: https://github.com/davidak/nixos-config
Oct 9 2016
Do you have any open source stacks/designs for handling thunderbolt?
If so, please share with us.
The problem is simple:
- we have 12 LVDS lanes with up to 1.5 Gbit/s of encoded data each (officially 1.0 Gbit/s per lane).
- SATA requires 3+ Gbit/s to be useful, i.e. we would need MGTs (multi-gigabit transceivers) to handle that.
- UHS-II can be done with a cheap FPGA
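The lane-rate argument above can be sanity-checked with trivial arithmetic; the numbers below come straight from the bullet points:

```python
# Sanity check of the lane-rate argument above (all rates in Gbit/s).

LANES = 12
LANE_OFFICIAL = 1.0   # official rating per LVDS lane
LANE_MAX = 1.5        # what a lane can carry in practice, per the post
SATA_MIN = 3.0        # minimum useful SATA rate, per the post

total = LANES * LANE_OFFICIAL
print(f"aggregate LVDS bandwidth: {total} Gbit/s")               # 12.0
print(f"SATA fits one plain LVDS lane: {SATA_MIN <= LANE_MAX}")  # False -> MGTs needed
```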
Didn't know that BM decided to make the PCC FOSS/OH ...
We need to record metadata anyway, for example to keep information about exposure time, sensor register settings, etc.
If we decide to do it in a separate file/stream then we need very precise timestamps (or frame numbers) to go with.
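As a rough illustration of the separate-stream idea, a per-frame record could carry a frame number plus a precise timestamp alongside the sensor settings. The field names and layout below are invented for illustration, not an actual AXIOM format:

```python
# Sketch: a per-frame metadata record keyed by frame number and timestamp,
# so a separate metadata stream can later be matched to the video frames.
# Field names are illustrative, not an actual AXIOM format.
import json
import time

def make_frame_record(frame_no, exposure_us, sensor_regs, t_ns=None):
    return {
        "frame": frame_no,                                   # frame number
        "ts_ns": t_ns if t_ns is not None else time.time_ns(),  # precise timestamp
        "exposure_us": exposure_us,                          # exposure time
        "regs": sensor_regs,                                 # sensor register settings
    }

rec = make_frame_record(42, exposure_us=10000,
                        sensor_regs={"0x10": "0x3A"}, t_ns=1_000_000)
print(json.dumps(rec))
```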
Well, actually a low-priced camera like the BMPCC can record 1080p RAW at 30 fps, which is affordable and good enough.
GoPro is a good example. However, 4K at 120 fps needs a really large budget: an interface fast enough to write to an M.2 SSD, and a CPU fast enough to take the input and store it as a RAW file. A roughly $3000 MFT camera like the DJI Zenmuse X5R can record lossless RAW 4K at a maximum of 2.4 Gbit/s, but it is not open source, and for professional use it would not perform very well in low light.
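A quick back-of-the-envelope estimate shows why 4K at 120 fps is so demanding. The resolution and bit depth below are assumptions (DCI 4K, 12-bit raw), not figures from any specific camera:

```python
# Rough data-rate estimate for uncompressed RAW video.
# Assumed: DCI 4K (4096x2160), 12 bits per photosite, 120 fps.
width, height, bits, fps = 4096, 2160, 12, 120

gbps = width * height * bits * fps / 1e9
print(f"{gbps:.1f} Gbit/s")  # ~12.7 Gbit/s before any compression
```

Even with lossless compression knocking that down by half, it dwarfs the Zenmuse X5R's 2.4 Gbit/s figure mentioned above.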
Well, accelerometer info for a camera is actually a great idea. But the problem is how it can be included in the metadata. Should it be stored in a separate format, say .aclmif (Accelerometer Info), containing a very precise track of the accelerometer over time? You would have to include timecode info too, so the accelerometer data can be matched up over time. And at first the only program able to read .aclmif would be OpenCine, with the likes of Premiere Pro perhaps adding support only years later. Still, adding a new format is not a bad idea, but it should not be in this section; it belongs in the software section.
What I think is that UHS-II cards would not be practical, because this device is mainly for video production and RAW recording. With a RAID array of several UHS-II cards you could get what you want, namely RAW recording, but a bunch of UHS-II cards would be impractical because you would need a pile of SD card readers and hubs to read all the data back. UHS-II cards also cost more than $1/GB right now, so they are not practical when SATA SSDs are cheaper, faster, and larger. With a single UHS-II card, however, you could record 4K in ProRes, or maybe CineForm (I haven't checked CineForm's data rates).
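To put a number on the RAID idea: assuming roughly 156 MB/s sustained per card (the nominal UHS-II FD156 rate; real sustained writes are often lower) and an uncompressed 4K/120 RAW stream of about 12.7 Gbit/s, the card count alone makes the point:

```python
# How many UHS-II cards a RAID would need for uncompressed 4K RAW.
# 156 MB/s sustained per card is an assumption (nominal UHS-II FD156 rate);
# 12.7 Gbit/s is a rough figure for 4K 12-bit 120 fps uncompressed.
import math

raw_gbps = 12.7                 # uncompressed 4K RAW stream
card_gbps = 156 * 8 / 1000      # per-card rate in Gbit/s (~1.25)

cards = math.ceil(raw_gbps / card_gbps)
print(cards)  # 11 cards striped -- clearly impractical
```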
Well, what I think is that cooling a camera is not easy. We would need a highly thermally conductive material, e.g. diamond (which conducts electricity poorly but conducts heat best of all) or silver (which is cheaper, conducts electricity best of all, and is second only to diamond for heat). Realistically, aluminium is the cheapest option. We also need somewhere cool to dump the heat, so I think we should have some sort of heat spreader sitting directly on the CPU. Carrying the heat away with water, as in water cooling, would lose some heat through evaporation, which is not a good idea because it is just inefficient. We need to spread the heat, but the handle, the base, and all the buttons of course should not get hot, because the user touches them, and the LCDs should not get hot either. So we should spread the heat fast enough that the camera itself stays cool.
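For reference, the material comparison above lines up with textbook room-temperature thermal conductivities (the values below are common ballpark figures, not measurements):

```python
# Approximate room-temperature thermal conductivities in W/(m*K).
# Textbook ballpark values, listed to back the material comparison above.
k = {"diamond": 2200, "silver": 429, "copper": 401, "aluminium": 237}

for name, val in sorted(k.items(), key=lambda kv: -kv[1]):
    print(f"{name:10s} {val:5d} W/(m*K)")
```

Copper is worth adding to the shortlist: nearly as conductive as silver, far cheaper, and the usual choice for heat spreaders.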
Oct 5 2016
indeed very interesting!
Oct 3 2016
good to hear some news :)
Oct 1 2016
Hi, we had spoken about making our own customized handle.
Jun 8 2016
May 22 2016
And I think we all agree that it's important to clearly version releases and label everything accordingly, so any release can be tracked/reproduced.
With rsync we can just have a "latest" repo plus repos with increasing version numbers in case anyone wants to try a specific version or downgrade to test/compare something.
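One small pitfall in the "latest plus versioned repos" layout is version ordering: repo names must be compared numerically, not as strings. A minimal sketch, assuming repo names like `v1.2` (the naming scheme is my assumption):

```python
# Sketch: choosing the newest firmware repo from a "latest" + versioned
# layout. Repo names like "v1.10" are an assumed naming scheme.

def pick_newest(repos):
    """Return the highest version among names like 'v1.2.10' (ignores 'latest')."""
    versions = [r for r in repos if r.startswith("v")]
    return max(versions, key=lambda r: tuple(int(p) for p in r[1:].split(".")))

repos = ["latest", "v1.2", "v1.10", "v1.9"]
print(pick_newest(repos))  # v1.10 -- numeric compare, not string compare
```

A plain string sort would wrongly rank `v1.9` above `v1.10`, so the tuple comparison matters when a client wants to downgrade to "one version back".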
All our plugin modules have an EEPROM on the I2C bus.
Recent Power Boards feature an EEPROM on the I2C bus as well.
The Main Board can be uniquely identified via PICs and MachXO2s.
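As an illustration of how such EEPROMs could identify a module, here is a sketch of decoding a board-ID record from raw EEPROM bytes. The layout (magic, board type, hardware revision, serial) is invented for illustration; it is not the actual AXIOM EEPROM format:

```python
# Sketch: decoding a hypothetical board-ID record read from a module's
# I2C EEPROM. The record layout (magic, board type, hw revision, serial)
# is invented for illustration -- not the actual AXIOM EEPROM format.
import struct

def decode_board_id(raw: bytes):
    # little-endian: 4-byte magic, u8 type, u8 revision, u32 serial
    magic, board_type, hw_rev, serial = struct.unpack("<4sBBI", raw[:10])
    if magic != b"AXBM":
        raise ValueError("not a recognised board-ID record")
    return {"type": board_type, "rev": hw_rev, "serial": serial}

# Simulated EEPROM content (first 10 bytes of the device):
blob = struct.pack("<4sBBI", b"AXBM", 2, 1, 1234)
print(decode_board_id(blob))  # {'type': 2, 'rev': 1, 'serial': 1234}
```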
The fact that the task is sitting there for a year now means that probably nobody is interested enough to work on it.
As you are very interested, maybe you could start working on the conversion.
We now think we know how UHS-II works and where the challenges are.
No hardware tests have been concluded so far because we still need to write software to utilize UHS-II.
I'd be interested to know how you went with UHS-II stuff?
On the topic of firmware management: as you have multiple different image parts and versions, it is really important that you embed as much version information into each part as possible.
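One cheap way to do that is to append a plainly findable version string to each image part, so even a simple scan (like the `strings` tool) can recover it from a deployed device. The marker convention below is invented for illustration:

```python
# Sketch: tagging each firmware image part with a version string that a
# simple byte scan (like the `strings` tool) can find later. The marker
# text is an invented convention, not an existing AXIOM one.

MARKER = b"FW-VERSION:"

def tag_image(image: bytes, version: str) -> bytes:
    """Append a NUL-terminated version record to an image blob."""
    return image + MARKER + version.encode() + b"\x00"

def read_version(image: bytes) -> str:
    """Recover the last version record embedded in an image blob."""
    start = image.rindex(MARKER) + len(MARKER)
    return image[start:image.index(b"\x00", start)].decode()

img = tag_image(b"\x01\x02\x03", "beta-1.4+g5e2a")
print(read_version(img))  # beta-1.4+g5e2a
```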
I'm very interested in getting a KiCad version of your HDMI beta module. Maybe you could start with attempting to do that?
You might want to take a look at http://libcec.pulse-eight.com/
May 21 2016
Actually (nitpicking here :) the IMU doesn't record the motion, it tracks it with several sensors.
May 20 2016
The IMU just records the motion of the camera (3D rotation, acceleration, etc.); what you do with that kind of information in post production is then up to the user/software.
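To make "use it in post" concrete, here is a minimal sketch of turning recorded gyro rates into a per-frame rotation track that stabilization software could consume. It is single-axis Euler integration only; a real pipeline would use all three axes and quaternions, and this is not how any existing AXIOM software works:

```python
# Sketch: integrating recorded gyro rates into a rotation track for
# post-production stabilization. Single-axis, simple Euler integration --
# a real pipeline would use quaternions and all three axes.

def integrate_yaw(gyro_dps, dt):
    """Cumulatively integrate yaw-rate samples (deg/s) into angles (deg)."""
    angle, track = 0.0, []
    for rate in gyro_dps:
        angle += rate * dt
        track.append(angle)
    return track

# 4 samples at 100 Hz during a constant 10 deg/s pan:
track = integrate_yaw([10.0, 10.0, 10.0, 10.0], dt=0.01)
print([round(a, 3) for a in track])  # [0.1, 0.2, 0.3, 0.4] degrees
```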
So I don't understand exactly how IMU data recording works in the Beta.
Can you take data from the IMU and use it directly in post production for better stabilization?
May 15 2016
Specifically test/enable CEC and HPD as well as DDC
There is another application: real-time tracking could be very useful for green-screen previsualisation of a live human in a 3D environment, or the other way around, a real environment with a VFX character, like the troll in Lord of the Rings (that was made with a motion-capture system, but today it should be possible with camera motion tracking).
For camera stabilisation it would be very cool if somebody made an extension card to drive the brushless motors directly.
Maybe information to easily find the optical center, and the offset between the IMU and the optical center, could be useful too.
May 14 2016
May 13 2016
The only applications for doing this live in the Beta that comes to mind would be sports/broadcast and here dedicated mechanical stabilization systems exist already... So I agree this is a post production thing mainly, but I think that's what chooksprod meant with mentioning opencine anyway.
While it certainly works well in post processing, this probably isn't a good idea for real-time stabilization in the AXIOM Beta.
May 8 2016
Great!
I found this one, probably you know it as well: http://docs.opencv.org/3.1.0/#gsc.tab=0
May 1 2016
As discussed over the last few days, let's set up an rsync server and test the concept of firmware updates over the internet.
Apr 25 2016
Hi everybody,
Mar 29 2016
I like the wood look !
I just started to design a "terminator" cage that we can take everywhere. I will keep it basic, with a sketchy design, but one that should be easy to machine on a CNC in wood or metal.
I will add many screw holes, like a cheese, once I am happy with the concept. This simple concept also works for customized handgrips and should be adaptable for the open sam gimbal, I think.
Mar 25 2016
Mar 19 2016
The clay is a great idea. An owner won't want to start slapping clay onto the camera, so one good way of creating a foundation for that would be to emulate what Wooden Camera does: once the form-factor specs are final, they create the camera body in wood. It's essentially a gimmick, but because the Beta is relatively box shaped this won't be all that difficult for a user to accomplish, and any differences between the outer form factor and the inner clay mould could be fine-tuned in-house.
Mar 18 2016
looks good!
Mar 17 2016
New design for the MicroZed Heatsink with 40x40x10 fan.
Mar 11 2016
Yeah, exactly! A new startup is launching a Kickstarter campaign for bicycle handgrips.
http://formygrips.com/pages/about-formy-bike
Mar 10 2016
It would be an interesting concept for a service to supply users with a piece of clay to squeeze together with their hand.
Then the clay is hardened and returned to the service provider, who 3d scans it and 3d prints a unique hand grip.
Feb 26 2016
Yeah, black filament has a better look for the Beta.
Feb 22 2016
Thanks for printing and the report!
Feb 17 2016
Second step!
Feb 4 2016
Altera MAX 10 doesn't seem to be a good option as far as we checked.
Jan 30 2016
Thanks for the report! This is excellent first hand knowledge!
Jan 29 2016
Hi everybody,
Jan 28 2016
In T212#9262, @chooksprod wrote:
Probably you know this project :
Jan 25 2016
Dec 29 2015
testupdate for troy
Stowing for reference:
https://github.com/x42/libltc/
Dec 17 2015
Order placed (Lexar and Transcend).
We need to buy 2 sample UHS-II cards from different manufacturers.
Oct 18 2015
Oct 17 2015
Sounds great, looking forward to it!
Oct 16 2015
Very nice!
Sep 28 2015
But we definitely need a different video to show the process (if we want to), because it is ill-advised to promote weapons in public :)
cerakote it is!
Sep 21 2015
Yes, interface board with FPGA is still on the roadmap :)
The HDMI Module and its outline have been fixed (for the 1x HDMI module).
Aug 31 2015
https://raw.githubusercontent.com/MacroFab/EDALibraries/master/Eagle/ULP/MF_Eagle_Placement.ulp might be very interesting for us
What would be possible with some effort is to put a bounding box around tPlace/bPlace (or alternatively around the pads).
Unlikely, since components can have practically arbitrary shapes.
Is there any place in EAGLE where you can read out the dimensions of components in mm? The pick & place lists don't include that...
Ah, ok Herbert! I was thinking about it because it gives about double the speed... probably something for the future...
AFAIK, CFast is basically SATA, so something different :)
Aug 26 2015
CFast 2.0 is no option?
Jul 31 2015
Jul 28 2015
latest board files: http://vserver.13thfloor.at/Stuff/AXIOM/BETA/?C=M;O=D
Jul 27 2015
Jul 14 2015
added it to the wiki page
Done, I think we have a final two-part solution:
Jul 11 2015
will do!
@sebastian: please check with Andon if they would do a two-part version (no connection between top and bottom block) - like the one we are investigating from Selwyn - and if, at what cost.
Jul 6 2015
Should we consider colour characterization in this? If so, it would be worth considering a light box that enables us to push extremely wide primaries outward as far as the spectral locus permits.