I'm working on it. Actually Step 2 is already fully automated.
Nov 2 2017
As you might have seen on the mailing list, Cineform is now open source.
In general, yes. To have a clean system we should package them, so we can also update them with the system's package manager.
Nov 1 2017
Any progress on this front?
Oct 28 2017
Oct 27 2017
Oct 19 2017
Sep 11 2017
Aug 15 2017
Some recent discourse for posterity:
Aug 14 2017
I began to do this with your help...
Aug 11 2017
Anyone up for creating the mentioned .gitlab-ci.yml file?
Aug 4 2017
Jul 26 2017
In T697#12042, @jatha wrote: The beta-software is now mirrored to https://gitlab.com/apertus/beta-hardware
@BAndiT1983 As the client side looks rather small, I would suggest vanilla JavaScript.
@maltefiala What about Meteor or Dart for the client side? I'm not that deep into the web development jungle, and if one searches the web, thousands of frameworks turn up.
Often there's someone out there who gets onto a platform and registers the name apertus before us.
Jul 25 2017
GitHub answered that the apertus organisation is not dormant but simply has no public activity. Maybe we should decide on an alternative short handle.
It will be way more work later. Also, I would like to have apertus as the name on GitHub too.
At this point in time that would be some more work. Because the GitLab is only a mirror of GitHub, the repo doesn't need any maintenance, so I would vote for keeping the GitLab account a "person".
Can we create a group on GitLab to give multiple people permissions, like with organizations on GitHub?
@RexOr please don't use apertus-open-source-cinema as the name, it's way too long, nobody wants to type that in! You could use it as the display name or description...
The beta-software is now mirrored to https://gitlab.com/apertus/beta-hardware. If somebody who's more into building the software manually could write a .gitlab-ci.yml and add it to the GitHub repository, the builds would start automatically on every commit.
done
Can you message me your email address please Jatha?
GitLab CI is indeed a good idea :). Mirroring from GitHub is also possible in the public instance but very buggy. To work around this we can create another repository with GitLab CI which only pulls from GitHub and pushes to GitLab, triggered by a GitHub webhook. I did exactly this setup for another project and it works very well (https://github.com/freifunkks/site-ffks/tree/beta); the mirror repo can be found here: https://gitlab.com/freifunkks/mirror-scripts
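For the record, the mirror job boils down to something like the following (a rough sketch, not the exact contents of the mirror-scripts repo linked above; the GitLab push URL is a placeholder that would normally come from a CI secret variable):

GITHUB_URL="https://github.com/apertus-open-source-cinema/beta-software.git"
GITLAB_URL="https://gitlab.com/<group>/<repo>.git"   # placeholder, set via a CI variable
git clone --mirror "$GITHUB_URL" mirror.git          # bare clone with every branch and tag
cd mirror.git
git push --mirror "$GITLAB_URL"                      # replay all refs to the GitLab side

Put into the script section of a .gitlab-ci.yml and triggered by a GitHub webhook, this keeps the GitLab copy in sync on every push.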
Good idea. Let's wait for @sebastian to create an account, I guess. Additionally, it seems gitlab.com allows the EE feature "repository mirroring": https://docs.gitlab.com/ee/workflow/repository_mirroring.html
@maltefiala GitLab is good software, so their CI is probably also good. I've never used it...
What about GitLab CI? I have an Ansible playbook for the setup, so it would be doable in minutes as long as someone provides a server. We could mirror the GitHub repo and have GitLab as a (more or less) transparent tool in the middle.
Generally I like https://drone.io/ as a CI. It executes the build in Docker containers, so you have a clean environment every time, yet it is as easy as Travis. It is open source; you can install it on your own server that has Docker (it runs in a Docker container itself). They currently don't have a public instance, but they will provide paid services in the future and probably a free tier for open source projects like in the past.
I think Travis wouldn't be the optimal solution for firmware builds, because we need some special tools with proprietary licenses for building the FPGA bitstreams, and more build time than the 50 minutes Travis gives you for free. IMO http://concourse.ci would be a good solution, because it would allow us to create quite modular pipelines for the firmware.
Jul 23 2017
Jul 18 2017
Reference for a Golang REST service, required for basic testing at the moment: https://thenewstack.io/make-a-restful-json-api-go/
Used JS REST lib (client side): https://github.com/marmelab/restful.js/tree/master
Jul 4 2017
@sebastian I think the issue is with the name of the cloning directory, or we should use a recipe for building kernel 4.6.0. Will check it again and create a PR soon.
@pgsamila was so kind as to adapt the axiom_beta_build_image_Ubuntu.sh script to use kernel version 4.6, as there are known issues with 4.9 (the latest).
Jul 1 2017
Try running axiom_beta_build_image_Ubuntu.sh from https://github.com/apertus-open-source-cinema/axiom-beta-qemu on an Ubuntu OS and report the results. The last commented-out line in that script starts the created image in QEMU.
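Roughly, that means (assuming the script sits at the top level of the repository and needs root to create the image):

git clone https://github.com/apertus-open-source-cinema/axiom-beta-qemu.git
cd axiom-beta-qemu
sudo ./axiom_beta_build_image_Ubuntu.sh   # builds the image, takes a while
# the last, commented-out line of the script is the qemu invocation that boots
# the freshly built image -- uncomment it or run it by hand once the build is done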
Hello! How can I help?
Jun 23 2017
Committed and updated the scripts to build and run QEMU: https://github.com/apertus-open-source-cinema/axiom-beta-qemu
Jun 22 2017
I committed a first version of the build script to GitHub.
Linux Mint 18.1:
Jun 21 2017
Champika, would you be so kind as to commit all the latest files related to getting QEMU set up and running, plus a readme, to https://github.com/apertus-open-source-cinema/axiom-beta-qemu
Bertl on IRC:
http://irc.apertus.org/index.php?day=15&month=06&year=2017#107
qemu-system-aarch64 -M arm-generic-fdt-7series -machine linux=on -m 1024 -serial /dev/null -serial mon:stdio -nographic -hw-dtb devicetree.dtb -kernel u-boot -drive if=sd,format=raw,index=0,file=beta_20170109.dd -boot mode=5
Get the devicetree.dtb from the first partition of the image and the u-boot binary from here: http://vserver.13thfloor.at/Stuff/AXIOM/BETA/u-boot
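One way to pull those two files together (a sketch, assuming a util-linux losetup with partition scanning and that the dtb sits at the root of the boot partition):

wget http://vserver.13thfloor.at/Stuff/AXIOM/BETA/u-boot
LOOP=$(sudo losetup -fP --show beta_20170109.dd)   # expose the image's partitions as loop devices
sudo mount "${LOOP}p1" /mnt                        # first partition of the image
cp /mnt/devicetree.dtb .
sudo umount /mnt && sudo losetup -d "$LOOP"

After that, the qemu-system-aarch64 command above should find devicetree.dtb, u-boot and the .dd image in the current directory.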
Fixed with the latest from https://lab.apertus.org/T737?
Updated build script:
http://vserver.13thfloor.at/Stuff/AXIOM/BETA/build_image.sh
May 23 2017
Raising to high priority because this is the main task blocking any further development of the entire firmware.
Apr 30 2017
Apr 27 2017
This is really impressive...
Apr 24 2017
Hi KG12-12
Apr 21 2017
Apr 20 2017
Apr 7 2017
See comment at T757.
Source as always here: https://github.com/apertus-open-source-cinema/beta-software
Yesterday I found the cause of the exception and will upload the real communication part today, which encodes settings, sends them to the daemon and decodes them. Afterwards the client test project will be converted to a library, shared or static, but that needs a discussion of requirements.
Apr 5 2017
Apr 3 2017
Last time I checked, it was OK after this change:
https://github.com/apertus-open-source-cinema/beta-software/commit/2d2ae95d0a36ae4839610af0c619bc7f7159b42f
Thank you for reviewing my proposal. Sorry I wasn't able to respond earlier; I was busy with a 36-hour-long 'Smart India Hackathon', which is over now. I will definitely work on your suggestions and submit the final proposal in the GSoC system in a few hours.
Apr 2 2017
Some form of highlight recovery would still be useful in this instance though, albeit not one based on white balance.
But I'm assuming in many cases people will be recording a log-encoded HDMI stream rather than raw data, in which case this would help.
Just to follow up from Sebastian: yeah, it wouldn't be necessary to do it to raw data, as that is better handled in post.
As I understand it, the approach generally relies upon approximations from adjacent, unclipped pixels.
In T244#11419, @Bertl wrote: Still doesn't make any sense to me, as there never is any missing data for any channel.
If the main goal is to avoid coloured highlights (i.e. one channel being saturated and thus tinting the bright areas), then there is a simple way which has already been suggested several times (only applicable for the adjusted live view; a rough sketch follows below):
- Select a safe range for all channels (e.g. 0-240 of 255)
- When any channel goes over the limit (240), increase the other channels proportionally till they all meet at 255
Note that this will make any saturated colour white, which might not be the desired outcome, but any guessing has some corner cases.
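For illustration, that suggestion amounts to the following for a single 8-bit RGB sample (just a sketch of the arithmetic in an awk one-liner; a real implementation would sit in the live-view pipeline):

echo "252 200 120" | awk '{
  limit = 240; top = 255                                      # safe range 0-240 of 255
  m = $1; if ($2 > m) m = $2; if ($3 > m) m = $3              # hottest channel
  t = (m > limit) ? (m - limit) / (top - limit) : 0           # 0 at the limit, 1 at full scale
  for (i = 1; i <= 3; i++) printf "%d ", $i + t * (top - $i)  # lift all channels toward 255 together
  print ""
}'
# prints "254 244 228": the nearly clipped first channel no longer tints the highlight by itself

As noted above, a fully saturated colour ends up white, which is the accepted trade-off.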
asked @alex if he could assist.
So, let's look into two example cases:
You're not quite getting it. It would probably be easier for you to understand by looking at an example of it in action. If you had a raw data sample in DNG format of a clipped source (such as a domestic light near a white wall), you could load it into Resolve and try turning highlight recovery on and off; then you'll see the detail appear and disappear in pixels near the point of clipping (it's often more dramatic with off-white lights).
Sounds nice and should be doable on the MicroZed.
It might be worth using decimation and keeping the framerate to avoid extensive buffering.
The obvious drawback is that connectivity via ethernet is lost.
Still doesn't make any sense to me, as there never is any missing data for any channel.
Was this investigated further since then?
Very likely, but I guess we have to simply test this on a real system.
@anuditverma,
thanks for your application! I just finished looking at it and am looking forward to receiving your final application tomorrow.
@BAndiT1983 What's the status of this? Please add a link to the source, thanks!
Apr 1 2017
@anuditverma,
Thanks for the ping, will answer tomorrow.