I think the way pollsters phrase questions, or the order in which certain questions are asked, can bias the results. But does that mean the pollsters themselves are biased? Take a look at some of today's results from the RealClearPolitics poll collection site:

North Carolina Democratic Primary
  InsiderAdvantage: Obama 48, Clinton 45 (Obama +3.0)
  Zogby Tracking: Obama 48, Clinton 40 (Obama +8.0)
  PPP (D): Obama 53, Clinton 43 (Obama +10.0)

Indiana Democratic Primary
  InsiderAdvantage: Clinton 48, Obama 44 (Clinton +4.0)
  Suffolk: Clinton 49, Obama 43 (Clinton +6.0)
  PPP (D): Clinton 51, Obama 46 (Clinton +5.0)
  SurveyUSA: Clinton 54, Obama 42 (Clinton +12.0)
  Zogby Tracking: Obama 44, Clinton 42 (Obama +2.0)

Democratic Presidential Nomination
  Gallup Tracking: Obama 50, Clinton 45 (Obama +5.0)
  Rasmussen Tracking: Obama 46, Clinton 45 (Obama +1.0)
  USA Today/Gallup: Clinton 51, Obama 44 (Clinton +7.0)
  CBS News/NY Times: Obama 50, Clinton 38 (Obama +12.0)

General Election: McCain vs. Clinton
  Rasmussen Tracking: Clinton 47, McCain 43 (Clinton +4.0)
  USA Today/Gallup: McCain 49, Clinton 46 (McCain +3.0)
  CBS News/NY Times: Clinton 53, McCain 41 (Clinton +12.0)

The Indiana difference between Zogby and SurveyUSA is really interesting. How can there be a 14-point differential? That spread is too large to write off as mere sampling error. The same is true of the USA Today/Gallup vs. CBS/NY Times McCain-Clinton match-up: a 15-point spread. The only poll I seem to trust is Rasmussen, which historically has come closest to calling the races correctly within the margin of error.
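To put a rough number on "too large to write off as sampling error": the post doesn't give the polls' sample sizes, so as an illustration only, assume a typical state primary poll of around 600 likely voters. A quick sketch of the standard margin-of-error formula shows why a spread in the low-to-mid teens can't be explained by sampling alone:

```python
import math

def margin_of_error(p, n, z=1.96):
    """Half-width of a 95% confidence interval for a sample proportion:
    z * sqrt(p * (1 - p) / n). Worst case is at p = 0.5."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical sample size (~600 likely voters); the actual sizes for
# these polls are not stated in the post.
n = 600
moe_pct = 100 * margin_of_error(0.50, n)
print(round(moe_pct, 1))  # about +/- 4 points on each candidate's share
```

Two independent polls of that size could plausibly differ by roughly twice the margin of error, call it 8 points or so, purely from random sampling. A 14- or 15-point gap therefore points to methodological differences (likely-voter screens, question wording and order, weighting) rather than chance.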