
Grasp pixel wise 160x128 images, levine 2016 model, densenet_fcn shrink #395

Merged
ahundt merged 13 commits into master on Jan 10, 2018

Conversation

ahundt
Member

@ahundt ahundt commented Jan 9, 2018

Implements #388

This implements Levine 2016 pixel-wise training. I just fixed a critical bug where the yx coordinates may not have been correct during pixel-wise training, so I'm performing a new run with [delta_depth, sin_theta, cos_theta] inputs.
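For anyone following along, pixel-wise training needs a dense target built from each grasp's yx image coordinate. Here is a minimal sketch of one way to encode that; the 128x128 size matches the images described below, but the one-hot encoding and function name are illustrative assumptions, not the code actually used in this PR.

```python
# Illustrative only -- not the repository's implementation.
import numpy as np

def grasp_yx_to_label_map(y, x, height=128, width=128):
    """Build a (height, width, 1) map that is 1.0 at the grasp pixel and 0.0 elsewhere."""
    label = np.zeros((height, width, 1), dtype=np.float32)
    label[y, x, 0] = 1.0
    return label

# Example: a grasp whose gripper center projects to pixel (y=40, x=90).
label_map = grasp_yx_to_label_map(40, 90)
print(label_map.shape, label_map.sum())  # (128, 128, 1) 1.0
```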

Images are now 128x128, but to train the DenseNetFCN in a reasonable amount of time I needed to add an early transition down and a matching transition up, which shrink the intermediate feature maps enough that batches fit in GPU memory.
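To sketch what the early transitions do (the layer choices and filter counts below are assumptions for illustration, not the exact DenseNetFCN changes in this PR): pooling right after the input halves the resolution the dense blocks have to process, and a matching upsampling step at the end restores the 128x128 pixel-wise output.

```python
from keras.layers import Input, Conv2D, AveragePooling2D, UpSampling2D
from keras.models import Model

inputs = Input(shape=(128, 128, 3))

# Early transition down: halve the spatial resolution before the expensive blocks.
x = Conv2D(48, 3, padding='same', activation='relu')(inputs)
x = AveragePooling2D(pool_size=2)(x)             # 128x128 -> 64x64

# ... DenseNet-FCN dense blocks and transitions would go here ...
x = Conv2D(48, 3, padding='same', activation='relu')(x)

# Late transition up: restore full resolution for the pixel-wise prediction.
x = UpSampling2D(size=2)(x)                      # 64x64 -> 128x128
outputs = Conv2D(1, 1, activation='sigmoid')(x)  # per-pixel grasp success

model = Model(inputs, outputs)
model.summary()
```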

I've started two 100-epoch training runs on:

Pixel-wise DenseNetFCN model with early transitions

export CUDA_VISIBLE_DEVICES="1" && python2 grasp_train.py --batch_size=8 --epochs 100 --save_weights densenet_fcn_dataset_062_b_063_072_a_082_b_102_delta_depth_sin_cos_3 --grasp_model grasp_model_segmentation --optimizer SGD

Example output filename:
2018-01-09-04-24-48_densenet_fcn_dataset_062_b_063_072_a_082_b_102_delta_depth_sin_cos_3-grasp_model_segmentation-epoch-001.h5

Pixel-wise Levine 2016 model:

export CUDA_VISIBLE_DEVICES="0" && python2 grasp_train.py --batch_size=6 --epochs 100 --save_weights levine_2016_segmentation_dataset_062_b_063_072_a_082_b_102_delta_depth_sin_cos_3 --grasp_model grasp_model_levine_2016_segmentation --optimizer SGD

Filename:
2018-01-09-03-58-56_levine_2016_segmentation_dataset_062_b_063_072_a_082_b_102_delta_depth_sin_cos_3-grasp_model_levine_2016_segmentation-epoch-002.h5
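As a side note, these per-epoch .h5 files are standard Keras weight checkpoints, so they can be restored onto a model with the same architecture. The helper below is a hypothetical sketch, and build_grasp_model is a placeholder rather than a real function in this repo.

```python
# Hypothetical sketch -- build_grasp_model is a placeholder, not a real helper here.
def resume_from_checkpoint(build_grasp_model, weights_path):
    """Rebuild the network and restore weights saved by Keras' model.save_weights()."""
    model = build_grasp_model()       # must produce the same architecture that was saved
    model.load_weights(weights_path)  # standard Keras call for .h5 weight files
    return model
```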

@ahundt ahundt requested review from cpaxton and DingYu95 January 9, 2018 16:32
@ahundt ahundt self-assigned this Jan 10, 2018
@ahundt
Member Author

ahundt commented Jan 10, 2018

These training runs haven't made any progress, so I've created #401 and #402 to address the bugs. However, the actual pipeline changes in this PR are still good, so it should be merged.

@ahundt ahundt merged commit 55838ec into master Jan 10, 2018