Disparity Estimation on Stereo Mammograms
Description
The problem of finding two points in a stereo image pair that correspond to the same point in the 3D scene is called the correspondence problem. The distance between the two corresponding points, when the two images are aligned one on top of the other, is called the disparity. The correspondence problem, or disparity estimation problem, has been studied extensively by the computer vision community. The motivation there is that the disparity, together with knowledge of the camera geometry, can be used to calculate the depth of each point, and depth information is essential, e.g., for collision avoidance in autonomous navigation. Hence the thrust in computer vision is to estimate dense depth maps, which means estimating a disparity for every pixel in the image.

Computationally efficient and cost-effective methods for doing this in real time are yet to come, nor will dense depth maps lead to compression. Fortunately, the human eye is quite good at filling in missing information, so sparse depth maps can effectively convey the desired stereo cues. Hence, in applications where the stereo pairs are meant for human viewing rather than computer ranging, computing sparse depth maps is the key to compression. Existing disparity estimation algorithms are founded on the premise of brightness (or color) constancy.

The focus of this paper is a novel computational stereo model that is specifically directed towards estimating a dense disparity map from a pair of stereo mammograms. The singularity index can be configured to detect point-mass-like structures, such as impulses in a 1D signal or curvilinear masses in images, while rejecting step edges; it can also be configured to do the opposite. We also plan to explore better and faster optimization algorithms, such as those based on graph cuts, for optimizing the proposed model.
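Two of the ingredients referred to above, the brightness-constancy premise behind classical matchers and the depth-from-disparity relation Z = f * B / d for a rectified camera pair, can be illustrated with a short sketch. The Python code below is not the paper's model; it is a generic block-matching baseline, and the window size, search range, and confidence threshold are illustrative values rather than parameters from the paper.

```python
# Minimal sketch (not the paper's model): dense disparity by block matching
# under the brightness-constancy assumption, plus the standard pinhole
# depth-from-disparity relation for a rectified stereo pair.
import numpy as np

def block_matching_disparity(left, right, max_disp=32, window=5, thresh=None):
    """Estimate a disparity for each pixel of `left` by searching along the
    same scanline in `right` (a rectified grayscale pair is assumed).

    max_disp : largest horizontal shift considered
    window   : odd block size for the sum-of-absolute-differences cost
    thresh   : if given, pixels whose best cost exceeds it are marked
               invalid (-1), yielding a sparse rather than dense map
    """
    left = np.asarray(left, dtype=np.float64)
    right = np.asarray(right, dtype=np.float64)
    h, w = left.shape
    half = window // 2
    disparity = np.full((h, w), -1, dtype=np.int32)
    best_cost = np.full((h, w), np.inf)

    # Pad so that windows near the image border are well defined.
    pl = np.pad(left, half, mode="edge")
    pr = np.pad(right, half, mode="edge")

    for d in range(max_disp + 1):
        # Brightness constancy: a left pixel at column x should look like the
        # right pixel at column x - d.  (np.roll wraps at the left border;
        # those few columns are ignored in this sketch.)
        shifted = np.roll(pr, d, axis=1)
        abs_diff = np.abs(pl - shifted)
        # Aggregate the per-pixel cost over a window (box sum via cumsum).
        cost = _box_sum(abs_diff, window)          # shape (h, w)
        better = cost < best_cost
        disparity[better] = d
        best_cost[better] = cost[better]

    if thresh is not None:
        # Keep only confident matches -> sparse disparity map.
        disparity[best_cost > thresh] = -1
    return disparity

def _box_sum(img, k):
    """Sum of values in each k x k window (valid positions only)."""
    c = np.cumsum(np.cumsum(img, axis=0), axis=1)
    c = np.pad(c, ((1, 0), (1, 0)), mode="constant")
    return c[k:, k:] - c[:-k, k:] - c[k:, :-k] + c[:-k, :-k]

def depth_from_disparity(disparity, focal_length_px, baseline):
    """Pinhole relation for a rectified pair: Z = f * B / d.
    Pixels with non-positive disparity are left at 0 (unknown depth)."""
    depth = np.zeros(disparity.shape, dtype=np.float64)
    valid = disparity > 0
    depth[valid] = focal_length_px * baseline / disparity[valid]
    return depth
```

Omitting `thresh` scores every pixel and produces a dense map; supplying it keeps only confident matches, which is the sparse-map route to compression discussed above.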