We analyze the rise and fall times of Type Ia supernova (SN Ia) light curves discovered by the Sloan Digital Sky Survey-II (SDSS-II) Supernova Survey. From a set of 391 light curves k-corrected to the rest-frame B and V bands, we find a smaller dispersion in the rising portion of the light curve than in the decline. This agrees qualitatively with computer models that predict variations in radioactive nickel yield to have less impact on the rise than on the spread of decline rates. The differences we find between the rise and fall properties suggest that a single "stretch" correction to the light curve phase does not properly model the range of SN Ia light curve shapes. We select a subset of 105 light curves well observed in both the rise and fall portions and develop a "2-stretch" fit algorithm that estimates the rise and fall times independently. We find an average time from explosion to B-band peak brightness of 17.38 ± 0.17 days, with individual rise times ranging from 13 to 23 days. Our average rise time is shorter than the 19.5 days found in previous studies; this reflects both the different light curve template used and the application of the 2-stretch algorithm. The SDSS-II supernova set and the local SNe Ia with well-observed early light curves show no significant differences in their average rise-time properties. We find that slow-declining events tend to have fast rise times, but that the distribution of rise minus fall time is broad and single-peaked, in contrast to the bimodality in this parameter first suggested by Strovink from an analysis of a small set of local SNe Ia. We divide the SDSS-II sample in half by the rise minus fall value, t_r − t_f ≤ 2 days and t_r − t_f > 2 days, to search for differences in their host galaxy properties and Hubble residuals; we find no difference in either for our sample.
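
The "2-stretch" idea above can be sketched as a fit in which the time axis of a template light curve is scaled by two independent factors, one before and one after peak. The sketch below is only illustrative: the Gaussian-like `template`, the fixed peak epoch, and the grid-search fit are stand-ins (the actual analysis uses an empirical SN Ia template and a full likelihood fit over all parameters), but it shows how separate rise and fall stretches are recovered from a single light curve.

```python
import numpy as np

def template(phase):
    # Hypothetical smooth light curve template peaking at phase 0.
    # (A stand-in for the empirical B/V-band SN Ia template.)
    return np.exp(-0.5 * (phase / 10.0) ** 2)

def two_stretch_model(t, t_peak, s_rise, s_fall):
    # Scale the time axis independently on the rising and falling sides,
    # so rise time and decline rate are decoupled.
    phase = t - t_peak
    stretched = np.where(phase < 0, phase / s_rise, phase / s_fall)
    return template(stretched)

# Synthetic light curve: a fast riser (s_rise = 0.9) with a slow
# decline (s_fall = 1.2), i.e. the kind of event the paper finds common.
t_obs = np.linspace(-15.0, 30.0, 46)
flux = two_stretch_model(t_obs, 0.0, 0.9, 1.2)

# Simple grid search over the two stretch factors (peak epoch fixed
# here for brevity; in practice it is fit simultaneously).
grid = np.arange(0.7, 1.4, 0.01)
best = min(
    (np.sum((two_stretch_model(t_obs, 0.0, sr, sf) - flux) ** 2), sr, sf)
    for sr in grid for sf in grid
)
_, s_rise_fit, s_fall_fit = best
print(round(s_rise_fit, 2), round(s_fall_fit, 2))  # recovers 0.9 and 1.2
```

Because the two stretches are independent, the fitted rise and fall times can be converted separately into the rest-frame rise time t_r and fall time t_f whose difference t_r − t_f is analyzed in the text.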