
Chapter 9: Business Intelligence and Big Data

Solutions to End of Chapter Material

Answers to What Would You Do Questions

Your company, which manufactures roller blades and related equipment and clothing, is interested in gaining a better understanding of its customers. While the company has a CRM system that contains information about customer contacts with the firm (phone calls, queries, and orders), there is interest in obtaining more data about what customers think and are saying about your company and its products. What might be some useful sources of additional data? What issues might be associated with collecting, sorting, and analyzing this data?

Responses will vary. Students may mention that data exists in a variety of formats. Some data is what computer scientists call structured data: its format is known in advance, and it fits nicely into traditional databases. Most of the data that an organization must deal with is unstructured data, meaning that it is not organized in any predefined manner and does not fit nicely into relational databases. Unstructured data exists in huge quantities and comes from many sources, such as text documents, electronic sensors and meters, audio files, email, video clips, surveillance video, phone messages, text messages, instant messages, postings on social media, medical records, x-rays and CT scans, photos, and illustrations. Many organizations are interested in analyzing unstructured data, often combining it with structured data to make business decisions. Unstructured data can provide rich and fascinating insights; with the right tools, it can add depth to data analysis that could not be achieved otherwise. Students may cite real-world examples to highlight the issues associated with collecting, sorting, and analyzing additional data.
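To make the structured/unstructured distinction concrete, here is a minimal sketch (in Python, with made-up field names and keyword lists) of how structured CRM records might be enriched with a crude sentiment score derived from unstructured customer comments. Real text analytics would use far more sophisticated techniques; this only illustrates the idea of combining the two kinds of data.

```python
# Hypothetical sketch: combining structured CRM records with a naive
# keyword-based sentiment score over unstructured customer comments.
# All field names and keyword lists are illustrative assumptions.

POSITIVE = {"great", "love", "smooth", "durable"}
NEGATIVE = {"broke", "slow", "poor", "refund"}

def sentiment_score(text):
    """Crude sentiment: +1 per positive keyword, -1 per negative keyword."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

# Structured data: CRM-style records with a known schema.
crm_orders = [
    {"customer_id": 1, "orders": 5},
    {"customer_id": 2, "orders": 1},
]

# Unstructured data: free-text comments, e.g., from reviews or social media.
comments = {
    1: "Love these blades, the ride is smooth and durable",
    2: "Wheel broke after a week, poor quality, want a refund",
}

# Combine the two: enrich each structured record with a sentiment score.
for record in crm_orders:
    record["sentiment"] = sentiment_score(comments[record["customer_id"]])

print(crm_orders)
```

Even this toy example surfaces the issues the answer goes on to list: the keyword lists embed bias, punctuation and misspellings defeat naive matching, and the structured and unstructured sources must share a reliable join key (here, `customer_id`).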
Additional sources of data for gaining a better understanding of customers could include:

• Social Media Monitoring: Monitoring social media platforms for mentions of your company, products, and related keywords can provide valuable insights into customer opinions, preferences, and sentiments.

• Online Reviews and Forums: Analyzing online reviews on e-commerce websites, forums, and community platforms can offer valuable feedback from customers regarding their experiences with your products and services.

• Customer Surveys: Conducting surveys to gather direct feedback from customers about their satisfaction levels, preferences, and suggestions for improvement can provide quantitative data for analysis.

• Focus Groups: Organizing focus groups with select customers can allow for in-depth discussions and qualitative insights into their perceptions, needs, and preferences.

• Customer Feedback Forms: Implementing feedback forms on your website or through email communications can encourage customers to provide their opinions and suggestions directly.

Issues associated with collecting, sorting, and analyzing this data may include:

• Data Quality: Ensuring the accuracy and reliability of data collected from various sources can be challenging, as there may be inaccuracies, biases, or inconsistencies.

• Data Integration: Integrating data from disparate sources such as CRM systems, social media platforms, and surveys into a cohesive dataset for analysis can require significant effort and technical expertise.

• Privacy and Compliance: Ensuring compliance with data protection regulations and maintaining customer privacy while collecting and analyzing data is crucial to avoid legal and ethical issues.

• Data Volume and Variety: Dealing with large volumes of data from multiple sources in different formats can pose challenges in terms of storage, processing, and analysis.
• Analysis Complexity: Analyzing diverse datasets to extract meaningful insights requires advanced analytical techniques and tools, as well as skilled data analysts or data scientists.

Overall, leveraging additional sources of data can provide valuable insights into customer perceptions and behaviors, but it requires careful planning, execution, and analysis to derive actionable insights effectively.

Based on the success of the University of Kentucky's system, your school is considering implementing its own system of data analysis to help increase student retention. You have been selected to participate in a student focus group to provide your thoughts and perspective. The focus group leader briefly explains the goals of the system and how it will work. She then asks for comments regarding the questions and concerns raised by the potential use of such a system. How do you respond?

Students should briefly mention how the data analysis system worked at the university. The University of Kentucky used SAP's HANA and Business Objects BI software to analyze student retention. A wide range of data was incorporated into the analysis, including high school GPAs, test scores, and student engagement metrics, such as how often the school's learning management system was used. All this was done in an attempt to learn what helped retain students. Students might agree or disagree to participate in the focus group. Some factors not considered in this system are the reasons behind a student's low test scores and limited participation in student engagement activities. The reasons could range from a mostly one-way teaching process to problems at home (child abuse), problems with peers (bullying), physical health issues, or mental health concerns (depression).

As a participant in the student focus group, I would first express my appreciation for the initiative to improve student retention through data analysis.
It's evident that understanding student behavior and needs can greatly enhance the effectiveness of support services and academic interventions. However, I would raise some questions and concerns to ensure that the implementation of this system is both beneficial and ethical:

• Data Privacy: I would inquire about the measures in place to safeguard student data privacy. It's crucial that any system collecting and analyzing student information complies with relevant privacy laws and regulations. Students should have transparency regarding what data is being collected, how it's being used, and who has access to it.

• Bias and Fairness: I'd want assurance that the data analysis algorithms are designed to mitigate biases and ensure fairness. Without proper oversight, there's a risk that the system could inadvertently disadvantage certain groups of students, perpetuating inequalities rather than addressing them.

• Student Input and Consent: It's important that students have a voice in the development and ongoing use of this system. I would advocate for mechanisms that allow students to provide feedback and consent regarding the data collection and analysis processes. Additionally, students should be informed about how their participation in the system may impact their academic experience.

• Supportive Interventions: While data analysis can identify students who may be at risk of dropping out, it's essential that the interventions implemented are supportive and effective. I would want clarity on what actions will be taken based on the insights gained from the system and how those actions will benefit students.

• Ethical Considerations: We should also consider the broader ethical implications of using data analytics in higher education. This includes issues such as the potential for surveillance, the impact on academic freedom, and the responsibility to use data ethically and responsibly.
Overall, while I recognize the potential benefits of implementing a data analysis system for student retention, I believe it's crucial to address these questions and concerns to ensure that the system is implemented in a way that respects student rights, promotes equity, and ultimately enhances the educational experience for all students.

You answer your door to find a political activist who asks you to sign a petition to place a proposition on the ballot that, if approved, would ban the use of data mining that includes any data about the citizens of your state. What would you do?

Some students might agree to sign the petition, while others may not. Students who would sign might point to the numerous privacy concerns associated with data mining, especially regarding the source of the data and the manner in which the results of the data mining are used. Students who would refuse to sign might cite examples such as that of the U.S. National Security Agency (NSA). The NSA used sophisticated data mining techniques to collect information and analyzed structured and unstructured data for patterns of terrorist or other potential criminal activity.

I'd politely thank the activist for their initiative and take a moment to inquire further about the proposition. While I value privacy and understand concerns about data mining, I'd also want to understand the specifics of the proposal and its potential implications. I'd ask questions like:

• What exactly does the proposition define as "data mining"?
• How would the ban be enforced and regulated?
• What potential impacts might this have on businesses, research, and innovation in our state?
• Are there any alternative solutions proposed to address privacy concerns while still allowing for beneficial uses of data?

After gathering this information, I'd take some time to research and consider the proposition further.
I'd weigh the potential benefits of increased privacy against any potential drawbacks or unintended consequences. Ultimately, I'd make an informed decision based on what I believe would be best for the community and the state as a whole. If I felt comfortable with the proposition, I would sign the petition to support its placement on the ballot. If not, I would politely decline to sign and possibly suggest alternative approaches to addressing privacy concerns.

Answers to Discussion Questions

How would you define business intelligence? Identify and briefly discuss a real-world application of BI that you recently experienced.

Business intelligence (BI) includes a wide range of applications, practices, and technologies for the extraction, transformation, integration, analysis, interpretation, and presentation of data to support improved decision making. In other words, BI refers to the process of gathering, analyzing, and visualizing data to help organizations make informed decisions; it involves the use of various technologies, tools, and methodologies to transform raw data into actionable insights that drive strategic and operational improvements.

A real-world application of BI that I recently encountered is in the retail industry. A large chain of supermarkets implemented a BI system to improve its inventory management and optimize sales. By analyzing sales data, customer demographics, and purchasing patterns, the system generated insights into which products were selling well and which were underperforming. These insights allowed the company to adjust its inventory levels, reorder popular items in a timely manner, and plan promotions to boost sales of slower-moving products. Additionally, the BI system helped the supermarket chain identify trends and patterns in customer behavior, such as preferred shopping times and popular product categories.
This information was used to tailor marketing campaigns and personalize promotions, enhancing the overall customer experience and increasing customer loyalty. Overall, the implementation of BI in this retail scenario resulted in increased operational efficiency, higher sales revenue, and better customer satisfaction. It showcases the power of BI in leveraging data to drive strategic decision-making and gain a competitive edge in the market.

Imagine that you are the sales manager of a large luxury auto dealer. What sort of data would be useful to you in order to contact potential new car customers and invite them to visit your dealership? Where might you get such data? What sort of BI tools might you need to make sense of this data?

Some students may say that the sales manager would require data about customers who have shown interest in buying a car from the brand or have the potential to buy one. The manager can identify potential customers by analyzing data derived from the automotive brand's Web site. Some of this data includes customers who used car configurator tools on the website, those who looked for dealers, and those who booked test drives. The manager might require BI tools such as a data warehouse or data mart to make better sense of the data.

As the sales manager of a luxury auto dealer, acquiring and analyzing relevant data is crucial for targeting potential new car customers effectively. Here's the kind of data that would be useful:

• Demographic Data: Understanding the demographics of potential customers can help tailor marketing efforts. This includes age, income level, occupation, and location.

• Psychographic Data: Knowing the lifestyle, interests, and values of potential customers can help in crafting personalized marketing messages that resonate with them.

• Purchase History: Analyzing past purchases can reveal patterns and preferences, helping predict future buying behavior.
• Online Behavior: Tracking online behavior such as website visits, social media engagement, and online searches related to luxury cars can provide insights into potential customers' interests.

• Referral Data: Identifying existing customers who are likely to refer others can be valuable for generating new leads.

• Competitor Analysis: Understanding what competitors are offering and how they're marketing their products can help in positioning your dealership effectively.

• Market Trends: Keeping abreast of market trends, such as shifts in consumer preferences or advancements in automotive technology, can inform marketing strategies.

Data for these purposes can be sourced from various places:

• Customer Relationship Management (CRM) Systems: CRM systems store customer information and interactions, providing a rich source of data for analysis.

• Social Media Platforms: Platforms like Facebook, Twitter, and LinkedIn offer insights into user demographics, interests, and behaviors.

• Third-Party Data Providers: There are companies that specialize in collecting and selling consumer data, which can be used to augment existing data sources.

• Website Analytics Tools: Tools like Google Analytics can provide valuable insights into website visitors' behavior and demographics.

To make sense of this data and derive actionable insights, you would need business intelligence (BI) tools such as:

• Data Visualization Tools: Tools like Tableau or Power BI can help visualize data in a meaningful way, making it easier to identify trends and patterns.

• Predictive Analytics Tools: Predictive analytics tools can help forecast future customer behavior based on historical data, enabling more targeted marketing efforts.

• Customer Segmentation Tools: Tools that segment customers based on various criteria can help tailor marketing messages to different customer groups.

• Dashboarding Tools: Dashboarding tools provide a snapshot view of key metrics and KPIs, enabling quick decision-making based on real-time data.
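As a rough illustration of what a customer segmentation tool does under the hood, the following sketch scores hypothetical website prospects against hand-picked weights. The signal names, weights, and thresholds are assumptions for illustration only; a real BI tool would derive them from historical sales data rather than hard-coding them.

```python
# Hypothetical sketch: rule-based lead scoring for dealership prospects.
# Signals and weights are illustrative assumptions, not a standard;
# a predictive analytics tool would learn these from historical data.

WEIGHTS = {
    "used_configurator": 2,    # built a car on the brand's website
    "booked_test_drive": 5,    # strongest purchase signal
    "visited_dealer_page": 1,  # looked up a local dealer
}

def score(prospect):
    """Sum the weights of every signal this prospect has triggered."""
    return sum(w for signal, w in WEIGHTS.items() if prospect.get(signal))

def segment(prospect):
    s = score(prospect)
    if s >= 5:
        return "hot"    # invite for a personal showroom appointment
    if s >= 2:
        return "warm"   # send a targeted offer
    return "cold"       # keep in the general mailing list

prospects = [
    {"name": "A", "used_configurator": True, "booked_test_drive": True},
    {"name": "B", "used_configurator": True},
    {"name": "C", "visited_dealer_page": True},
]

for p in prospects:
    print(p["name"], segment(p))
```

The segments then drive different outreach: "hot" prospects get a personal invitation to the dealership, "warm" ones a tailored offer, and "cold" ones stay in routine marketing.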
By leveraging such data and BI tools, you can better understand your potential customers, tailor your marketing efforts, and ultimately drive more sales for your luxury auto dealership.

This chapter began with the quote: "The most important goal and potential reward of big data initiatives is the ability to analyze diverse data sources and new data types, not managing very large data sets." Do you agree with this statement? Why or why not?

Some might agree with the statement, while others may not. Students may mention how organizations often employ BI to make predictions about future conditions and then make adjustments in staffing, purchasing, financing, and other operational areas to meet forecasted needs. The data used in BI is often pulled from multiple sources and may be internally or externally generated. Many organizations use this data to build a large collection of data called a data warehouse, or data mart, for use in BI applications. Users, including employees, customers, and authorized suppliers and business partners, can access the data and BI applications via the Web, organizational intranets, and even mobile devices such as smartphones and tablets. BI tools frequently operate on data stored in a data warehouse or data mart.

Yes, I agree with the statement that the most important goal and potential reward of big data initiatives is the ability to analyze diverse data sources and new data types, rather than solely managing very large datasets. Here's why:

1. Insight Generation: The primary purpose of collecting and analyzing big data is to extract valuable insights and actionable intelligence. Simply amassing vast amounts of data without the ability to effectively analyze it doesn't serve the fundamental goal of understanding and leveraging information.

2. Data Variety: Big data encompasses not just large volumes of data but also diverse types of data, including structured, unstructured, and semi-structured data.
Being able to analyze this diverse array of data sources provides richer insights and a more comprehensive understanding of the subject matter.

3. Innovation and Discovery: The real value of big data lies in its potential to uncover patterns, correlations, and trends that might not be immediately obvious. By combining and analyzing diverse data sources, organizations can uncover new insights, innovate, and discover previously unknown opportunities or challenges.

4. Decision Making: Ultimately, the goal of big data initiatives is to empower better decision-making processes. By analyzing diverse data sources and new data types, organizations can make more informed decisions based on a deeper understanding of the factors at play.

5. Competitive Advantage: In today's data-driven world, the ability to effectively analyze diverse data sets can provide a significant competitive advantage. Organizations that can harness the power of big data to gain insights into customer behavior, market trends, or operational efficiency are better positioned to succeed.

While managing very large datasets is certainly a challenge that must be addressed in big data initiatives, it is ultimately the analysis of this data that delivers the greatest value. Therefore, prioritizing the ability to analyze diverse data sources and new data types aligns with the overarching goals of big data initiatives.

Briefly describe the ETL process for building a data warehouse. Provide two examples of what might happen to the raw data during the data transform step.

The data in a data warehouse typically comes from numerous operational systems and external data sources. An extract-transform-load (ETL) process is used to pull data from these disparate data sources to populate and maintain the data warehouse. The extract step in the ETL process is designed to access the various sources of data and pull from each source the data desired to update the data warehouse.
During the extract step, the data is also screened for unwanted or erroneous values; data that fails to pass the edits is rejected. In the transform step, the data that will be used to update the data warehouse is edited and, if necessary, converted to a different format. The load step in the ETL process updates the existing data warehouse with the data that has passed through the extract and transform steps. This step creates a new, updated version of the data warehouse.

The ETL (Extract, Transform, Load) process is essential for building a data warehouse:

1. Extract: Data is extracted from multiple sources such as databases, spreadsheets, logs, etc., and transferred to a staging area.

2. Transform: In this step, data undergoes various transformations to make it suitable for analysis and storage in the data warehouse. Two examples of transformations include:
• Data Cleaning: Removing duplicates, correcting errors, handling missing values, and standardizing formats to ensure data consistency.
• Data Aggregation: Summarizing or aggregating data at different levels (e.g., daily, monthly) to facilitate analysis and reporting.

3. Load: Transformed data is loaded into the data warehouse, where it is organized and stored for querying and analysis.

During the data transform step, raw data might be modified in various ways to meet the requirements of the data warehouse schema and business needs. For instance:

1. Raw data might be enriched with additional information obtained from external sources to enhance its value for analysis.

2. Data may undergo restructuring or reshaping to conform to the dimensional model of the data warehouse, such as pivoting data from a wide format to a long format for better analysis and reporting.

The Internal Revenue Service maintains a large data warehouse containing 10 years of tax return data. Identify and list four other data warehouses kept by other branches of the federal government and the purposes that each one serves.
Students may identify and list any data warehouses kept by the various branches of the federal government and state the purposes that each one serves. The following are some examples:

• The U.S. Census Bureau contains a wealth of data about U.S. residents. Most of this data is summarized to a household or census tract level. This data is used for various purposes, including analysis of proposed government policies.

• The Bureau of Motor Vehicles contains information about automobiles, their owners, and driving records.

• The Veterans Administration has data about military personnel, their service records, and their medical history.

• The Citizen Data Warehouse System (CDWS) developed by the NSA creates a comprehensive database containing detailed information about each U.S. citizen. This data is primarily used to integrate each bank's credit card processing system with the NSA's data mining facility in San Antonio.

Four data warehouses kept by other branches of the federal government, and the purposes each serves, are:

1. Centers for Disease Control and Prevention (CDC) Data Warehouse: The CDC maintains a data warehouse that stores vast amounts of health-related data, including information on diseases, outbreaks, public health trends, and demographic statistics. This warehouse serves as a valuable resource for epidemiologists, researchers, policymakers, and healthcare professionals to analyze health trends, track diseases, and formulate public health strategies.

2. Federal Bureau of Investigation (FBI) Criminal Justice Information Services (CJIS) Data Warehouse: The FBI CJIS division manages a data warehouse that stores criminal justice information, including crime statistics, fingerprints, DNA profiles, background checks, and other law enforcement data. This warehouse supports various law enforcement agencies in their investigations, criminal justice research, and the maintenance of national crime databases.

3. National Aeronautics and Space Administration (NASA) Earth Observing System Data and Information System (EOSDIS): EOSDIS is a data warehouse maintained by NASA to archive and distribute Earth science data collected from satellites, aircraft, and ground-based instruments. This warehouse facilitates research in climate science, environmental monitoring, natural disaster management, and various other Earth science disciplines by providing access to a vast array of remote sensing data.

4. Department of Education (ED) Integrated Postsecondary Education Data System (IPEDS) Data Warehouse: The Department of Education maintains the IPEDS data warehouse, which contains comprehensive data on U.S. colleges and universities, including enrollment statistics, graduation rates, financial aid information, institutional characteristics, and academic programs. This warehouse supports education policymakers, researchers, students, and parents in making informed decisions about higher education by providing access to reliable and standardized data on postsecondary institutions.

The opening vignette mentions that Amazon has acquired a patent for "anticipatory shipping" that would enable it to ship products to customers even before they go online. What do you think of this concept? What advantages does it offer Amazon? The customer?

Amazon's anticipatory shipping is a system of delivering products to customers before they place an order. This system predicts the customer's order based on their previous purchase history and their tastes, preferences, and habits. According to the patent, the products will be waiting at the shipper's hubs or on trucks until the customer places the order. If executed well, this is the next level of predictive analysis, which will allow the company to grow and expand its customer base. For Amazon, the primary advantage is that it increases its customers' satisfaction and their loyalty toward the business organization.
It also provides an opportunity for good word-of-mouth publicity on behalf of the organization. If implemented perfectly, the primary advantage of this strategy for customers is the decrease in the delivery time of the product. Another advantage is that it saves customers time by already stocking the products that they purchase at frequent intervals.

Amazon's concept of "anticipatory shipping" is an innovative strategy aimed at revolutionizing the e-commerce experience. By utilizing predictive analytics and consumer behavior data, Amazon aims to preemptively ship products to distribution centers closer to customers even before they make a purchase. This approach offers several advantages for both Amazon and its customers.

Advantages for Amazon:

• Enhanced Customer Satisfaction: Anticipatory shipping can significantly reduce delivery times, leading to happier customers. By preemptively stocking products near potential buyers, Amazon minimizes the time between order placement and product arrival, potentially improving customer satisfaction and loyalty.

• Cost Efficiency: Preemptive shipping allows Amazon to optimize its supply chain and distribution network. By strategically placing products in anticipation of demand, the company can reduce transportation costs and optimize inventory management, ultimately increasing operational efficiency and profitability.

• Competitive Edge: Anticipatory shipping can serve as a significant differentiator for Amazon in the highly competitive e-commerce market. By offering faster delivery times and a more seamless shopping experience, Amazon can attract and retain customers, potentially gaining a competitive edge over rivals.

Advantages for Customers:

• Faster Delivery: Anticipatory shipping means shorter delivery times for customers. By proactively shipping products to distribution centers near potential buyers, Amazon can expedite the shipping process, allowing customers to receive their orders more quickly.
• Improved Convenience: Faster delivery times translate to greater convenience for customers. With anticipatory shipping, customers can enjoy a more seamless shopping experience, with reduced waiting times between placing an order and receiving their products.

• Increased Product Availability: Anticipatory shipping increases the likelihood of products being available when customers need them. By stocking items near potential buyers, Amazon can ensure that popular products are readily accessible, reducing the risk of stockouts and improving overall customer satisfaction.

Overall, Amazon's anticipatory shipping concept represents a forward-thinking approach to enhancing the e-commerce experience for both the company and its customers. By leveraging predictive analytics and proactive inventory management, Amazon aims to redefine the standards of speed, convenience, and customer satisfaction in online retail.

What is the difference between OLAP analysis and drill-down analysis? Provide an example of the effective use of each technique.

Online analytical processing (OLAP) is a method to analyze multidimensional data from many different perspectives. It enables users to identify issues and opportunities as well as perform trend analysis. Databases built to support OLAP processing consist of data cubes that contain numeric facts called measures, which are categorized by dimensions such as time and geography. Drill-down analysis is a powerful tool that enables decision makers to gain insight into the details of business data to better understand why something happened. It involves the interactive examination of high-level summary data in increasing detail to gain insight into certain elements.

OLAP (Online Analytical Processing) analysis and drill-down analysis are both techniques used in data analysis, but they differ in their approach and focus.

OLAP Analysis: OLAP analysis involves examining multidimensional data from various perspectives.
It allows users to analyze data across multiple dimensions such as time, geography, product, and customer segment. OLAP tools typically provide functionalities like slicing, dicing, pivoting, and filtering to interactively explore data. OLAP is suitable for summarizing data and gaining insights into overall trends and patterns.

Example: Suppose a retail company wants to analyze its sales performance. Using OLAP analysis, the company can examine sales data across different dimensions such as time (monthly, quarterly), product categories, and regions. This allows them to identify best-selling products, seasonal trends, and regional variations in sales.

Drill-Down Analysis: Drill-down analysis involves examining detailed data at progressively deeper levels of granularity. It allows users to navigate from summarized data to more detailed levels of information. Users can drill down from higher-level aggregates to lower-level details to understand the underlying factors contributing to trends or anomalies.

Example: Continuing with the retail example, suppose the company notices a decrease in sales revenue for a particular region during a specific quarter. With drill-down analysis, they can drill down into the data to investigate further. They may find that the decline in sales is primarily driven by a decrease in sales of a specific product category or that it is limited to certain stores within the region. This detailed insight enables the company to take targeted actions to address the issue.

In summary, OLAP analysis focuses on exploring data from multiple dimensions to gain a holistic view of trends and patterns, while drill-down analysis allows users to delve into detailed data to understand the underlying factors contributing to those trends or anomalies. Both techniques are valuable for data analysis, offering complementary approaches to gaining insights from data.
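The roll-up versus drill-down contrast described above can be sketched in a few lines of Python over a toy fact table; the sales figures below are invented for illustration.

```python
# Sketch of OLAP-style roll-up vs. drill-down over a tiny fact table.
# The sales figures are made up for illustration.
from collections import defaultdict

# Fact table: one row per (quarter, region, category) with a sales measure.
sales = [
    ("Q1", "East", "Shoes", 100),
    ("Q1", "East", "Shirts", 40),
    ("Q1", "West", "Shoes", 80),
    ("Q2", "East", "Shoes", 60),
    ("Q2", "West", "Shirts", 90),
]

# OLAP roll-up: summarize the cube along one dimension (region).
by_region = defaultdict(int)
for quarter, region, category, amount in sales:
    by_region[region] += amount
print(dict(by_region))  # high-level view, one total per region

# Drill-down: take one summary cell (East) and break it out in more
# detail (by quarter and category) to see what drives the total.
east_detail = defaultdict(int)
for quarter, region, category, amount in sales:
    if region == "East":
        east_detail[(quarter, category)] += amount
print(dict(east_detail))
```

A real OLAP tool precomputes such aggregates inside a data cube so analysts can pivot and drill interactively, but the underlying operations are exactly these two: aggregate along chosen dimensions, then re-aggregate a single cell at finer granularity.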
Identify at least four key performance indicators that could be used by the general manager at a large, full-service health club to define the current state of operations, including trainers; workout equipment; indoor and outdoor swimming pools; spa; salon; juice bar; health food restaurant; and basketball, handball, and tennis courts. Sketch what a dashboard displaying those KPIs might look like.

Some KPIs that could be used by the general manager at a large, full-service health club are listed below:

• Increase the number of customers who have a membership in the health club within the next quarter.
• Decrease the waiting time for customers (at the restaurant, spa, salon, juice bar, and workout training with trainers) within the next quarter.
• Reduce the number of inactive members to less than five percent in the next fiscal year.
• Increase the number of activities that each member of the club is willing to participate in within the next quarter.

Here are four key performance indicators (KPIs) that a general manager at a large, full-service health club might use to gauge the current state of operations:

• Membership Growth Rate: This KPI measures the rate at which new members are joining the health club compared to the rate at which existing members are leaving. It gives insights into the club's attractiveness and its ability to retain members over time.

• Occupancy Rate: This KPI indicates how efficiently the club's facilities are being utilized. It measures the percentage of available spaces (e.g., workout equipment, courts, spa appointments) that are being used at any given time. A high occupancy rate suggests high demand and effective resource allocation.

• Customer Satisfaction Score (CSS): Customer satisfaction is crucial for the success of any service-based business. This KPI can be measured through surveys, feedback forms, or online reviews. It reflects members' overall satisfaction with the club's facilities, services, and staff interactions.
Revenue per Member: This KPI measures the average revenue generated from each member over a specific period, such as monthly or annually. It helps assess the club's financial performance and the effectiveness of its pricing strategies, upselling techniques, and membership packages.

A dashboard displaying these KPIs might present each of them (membership growth rate, occupancy rate, customer satisfaction score, and revenue per member) as a gauge or trend chart shown against its target. Such a dashboard provides a snapshot of the health club's performance across various areas, allowing the general manager to quickly assess strengths, weaknesses, and areas for improvement.

Your nonprofit organization wishes to increase the efficiency of its fundraising efforts. What sort of data might be useful to achieve this goal? How might BI tools be used to analyze this data?

If the organization has the data from previous fund drives, it can use the data to identify the small percentage of people who make the largest contributions and focus its solicitations on those people. It is not uncommon for approximately 80% of the donations to come from roughly 20% of the people who are solicited.

To increase the efficiency of fundraising efforts, your nonprofit organization can leverage various types of data to gain insights and make informed decisions. Here's what sort of data might be useful and how business intelligence (BI) tools can analyze it:

1. Donor Data: Collect data on your donors' demographics, giving history, preferred communication channels, and donation patterns. BI tools can analyze this data to identify high-value donors, segment donors based on their characteristics, and personalize fundraising strategies for each segment.

2. Campaign Performance Data: Track the performance of past fundraising campaigns, including metrics such as donation amounts, conversion rates, and return on investment (ROI). BI tools can analyze this data to identify successful fundraising tactics, optimize campaign budgets, and predict the potential outcome of future campaigns.

3. Engagement Metrics: Monitor engagement metrics from your website, email newsletters, social media platforms, and events. BI tools can analyze metrics such as website traffic, email open rates, social media interactions, and event attendance to understand donor behavior and preferences. This insight can inform targeted outreach efforts and improve engagement with donors.

4. Financial Data: Track financial data related to fundraising expenses, revenue generated, and budget allocation. BI tools can provide real-time dashboards and reports to monitor fundraising performance against budget targets, identify areas of overspending or inefficiency, and optimize resource allocation for maximum impact.

5. Trends and External Factors: Monitor external factors such as economic trends, changes in regulations, and competitor activities that may impact fundraising efforts. BI tools can integrate external data sources and perform advanced analytics to identify trends, risks, and opportunities that could influence fundraising strategies.

6. Volunteer Data: If your nonprofit relies on volunteers for fundraising activities, collect data on volunteer participation, skills, and availability. BI tools can analyze volunteer data to optimize volunteer recruitment, training, and deployment, ensuring that fundraising efforts are adequately supported.

7. Feedback and Surveys: Gather feedback from donors, volunteers, and other stakeholders through surveys and feedback forms. BI tools can analyze qualitative data from open-ended responses and sentiment analysis to understand donor preferences, satisfaction levels, and areas for improvement in fundraising strategies.

By harnessing the power of data and BI tools, your nonprofit organization can make data-driven decisions to increase the efficiency and effectiveness of its fundraising efforts, ultimately driving greater impact and support for your cause.

Must you be a trained statistician to draw valid conclusions from the use of BI tools?
Why or why not?

Some students might agree that a trained statistician can draw valid conclusions from the use of BI tools, while others may not. Trained statisticians do have a better way of interpreting the data. They also know how to obtain the right data to answer questions about an organization's growth, the optimization of its operations, and customer satisfaction. However, most BI tools today do not require a trained eye, as they present their results in ways that are easy for users to understand.

No, you don't necessarily need to be a trained statistician to draw valid conclusions from the use of Business Intelligence (BI) tools, although having a statistical background can certainly be advantageous. BI tools are designed to simplify data analysis and present insights in a user-friendly manner, often through visualizations and dashboards.

However, understanding basic statistical concepts such as averages, distributions, correlations, and significance testing can significantly enhance your ability to interpret and draw meaningful conclusions from the data. It allows you to discern patterns, identify trends, and make informed decisions based on the evidence presented by the BI tools. While BI tools can automate many analytical processes, it's still important to have a critical mindset and the ability to interpret the results in context. Without statistical knowledge, there's a risk of misinterpreting the data or drawing flawed conclusions, which could lead to poor decision-making.

In summary, while you don't need to be a trained statistician to use BI tools, having a basic understanding of statistics can greatly improve your ability to extract valuable insights and make informed decisions from the data they provide.

Action Needed

In a highly controversial move, your favorite social network has just agreed to allow Walmart access to the postings, messages, and photos of its users.
Walmart will also gain access to user names and email addresses—in violation of the network's privacy policy. Walmart plans to mine this data to learn more about what its customers want and to develop targeted direct mailings and emails promoting those items. You are so strongly opposed to (or in favor of) this change in the privacy policy that you are motivated to send a message to the social network expressing your opinion. What do you say?

Some students might be opposed to this policy, while others may be in favor of it. A concerned user of the social networking site might request that the organization decline Walmart access to the private information of its users, since granting it would likely be illegal. If the networking site does allow Walmart access to its customers' private information, it might be sued by its users for violating its privacy policy and may even have to pay substantial damages to them.

Subject: Urgent: Revoke Walmart's Access to User Data

Dear [Social Network Name],

I am writing to express my deep concern and outrage regarding the recent decision to grant Walmart access to the personal data of your users. This move blatantly violates the trust and privacy of millions of individuals who rely on your platform to connect with friends and family.

Allowing Walmart to access user postings, messages, and photos, as well as their names and email addresses, is a flagrant breach of your own privacy policy. Users entrust your platform with their personal information under the expectation that it will be kept secure and used responsibly. By granting such access to a corporate entity, you are betraying that trust and putting users at risk of exploitation.

Walmart's intention to mine this data for targeted marketing purposes is deeply troubling. It represents a gross invasion of privacy and a clear disregard for the rights of your users.
No individual should have their personal data used in such a manner without explicit consent. I urge you to reconsider this decision immediately and revoke Walmart's access to user data. Uphold your commitment to user privacy and demonstrate that you prioritize the well-being and security of your community over corporate interests. Failure to take action will undoubtedly lead to a mass exodus of users who no longer feel safe or respected on your platform. Do the right thing and protect the privacy of your users before irreparable damage is done.

Sincerely,
[Your Name]

You are the sales manager of a software firm that provides BI software for reporting, query, and business data analysis via OLAP and data mining. Write a paragraph that your sales reps can use when calling on potential customers to help them understand the business benefits of BI.

The following is a list of the most frequently mentioned business benefits of BI:

BI makes it easy for decision-makers to use and evaluate critical information anytime and anywhere. This enables them to work in ways that better their productivity and affirm the overall business strategy.

BI enables employees to access information using familiar, convenient tools powered by a proven, measurable BI platform. When information is easy to relate to, people are better able to interpret and analyze it, resulting in knowledgeable, astute decision making.

BI can be used to define strategy, set goals, oversee performance, conduct group analysis, and then make decisions that affirm the overall business strategy.

BI empowers people in an organization to interact with data using tools that are recognizable, accessible, and extensively supported, thus decreasing training costs and considerably reducing the learning curve.

BI provides abundant scorecard functionality, supported by reports, charts, graphs, and analysis. This means that employees can easily track key performance indicators (KPIs) against vital business goals.
Understanding and evaluating the link between KPIs and a business organization's corporate goals means that its members gain better awareness of how the business is operating today, rather than having to wait until the end of the month or quarter, when it is too late to take action and influence performance.

Hello, [Potential Customer's Name],

I'm reaching out from [Your Company Name], a leading provider of Business Intelligence solutions. In today's competitive landscape, having actionable insights at your fingertips is paramount for driving growth and making informed decisions. Our BI software empowers businesses like yours to harness the full potential of their data through advanced reporting, query capabilities, and sophisticated data analysis via OLAP and data mining techniques. By implementing our solution, you can unlock hidden patterns, trends, and opportunities within your data, enabling you to optimize processes, identify new revenue streams, and enhance overall business performance. With our user-friendly interface and customizable dashboards, you'll gain real-time visibility into key metrics, enabling you to stay agile and proactive in an ever-evolving market. Let's schedule a demo to explore how our BI software can transform your business landscape.

You are the new operations manager of a large call center for a multinational retailer. The call center has been in operation for several years but has failed to meet both the customers' and senior management's expectations. You were hired three months ago and challenged to "turn the situation around." As you are sitting at your desk one day, you get a phone call from your boss asking that you lead a pilot project to implement the use of dashboards in the call center. The goal is to demonstrate the value of dashboards to help monitor and improve the operations in many of the firm's business units. How do you respond to your boss's request?
Some students may feel that this sounds like a perfect opportunity to demonstrate the value of dashboards to improve an important business function. However, before building a dashboard, it will be critical for the manager to define and gain alignment on the key customer and senior management expectations of the call center. These expectations will become the KPIs and key measures tracked by the dashboard.

Implementing dashboards in our call center operations can indeed be a game-changer for us in terms of monitoring and improving our performance. Here's how I plan to approach this:

Understanding the Objectives: Before diving into the implementation, I'd like to have a detailed discussion with you to understand the specific objectives and key performance indicators (KPIs) that we aim to track through these dashboards. It's essential to align the project goals with the broader objectives of the call center and the company as a whole.

Stakeholder Engagement: I'll engage with key stakeholders within the call center, including team leaders, supervisors, and frontline staff, to gather their insights and requirements. Their input is crucial for designing dashboards that are intuitive, actionable, and aligned with day-to-day operations.

Selecting the Right Tools: Once we have a clear understanding of our objectives and requirements, I'll evaluate different dashboard tools available in the market to identify the most suitable one for our needs. Factors such as scalability, integration capabilities, and user-friendliness will be considered during the selection process.

Customization and Development: With the selected tool in hand, I'll work closely with our IT team or external vendors to customize and develop the dashboards according to our specifications. This may involve integrating data from various sources, designing visualizations, and setting up real-time monitoring capabilities.
Training and Adoption: Rolling out the dashboards successfully requires effective training and change management initiatives. I'll develop training programs to ensure that all staff members are proficient in using the dashboards and understand their significance in driving performance improvements.

Continuous Improvement: Implementing dashboards is just the beginning. I'll establish mechanisms for continuous monitoring and feedback to identify areas for improvement and optimization. Regular reviews with stakeholders will help us refine the dashboards and adapt to evolving business needs.

Measurement and Evaluation: Finally, we'll establish metrics to measure the impact of the dashboards on our operations, such as improvements in response times, customer satisfaction scores, and agent productivity. This data will be crucial for demonstrating the value of dashboards to senior management and paving the way for broader adoption across other business units.

By following these steps, I'm confident that we can showcase the value of dashboards in enhancing call center operations and contribute to the overall success of the firm's business units. Thank you for entrusting me with this important initiative, and I look forward to delivering results.

Web-Based Case: Amazon Launches AWS

Do research online to identify several users of Amazon AWS BI products and services. What sort of costs and start-up efforts are required to employ Amazon AWS services? How does this compare with the costs of developing this infrastructure in-house? What organizations compete with Amazon in the BI platform arena? What are their costs and relative strengths and weaknesses compared to the Amazon offerings?

Students may mention that the costs required to employ Amazon AWS services are minimal compared to the cost of developing the infrastructure in-house. Organizations such as Oracle, Microsoft, SAP, and IBM are Amazon's competitors in the BI platform arena.
Students would be required to understand Amazon's competitors' BI products and services to gauge their costs, strengths, and weaknesses.

Amazon Web Services (AWS) is a leading provider of cloud computing services, including various Business Intelligence (BI) products and services. Several users of Amazon AWS BI products and services include:

Netflix: Netflix relies heavily on AWS for its data analytics needs, utilizing services like Amazon Redshift for data warehousing and Amazon QuickSight for BI and visualization.

Airbnb: Airbnb utilizes AWS for its data processing and analytics, leveraging services like Amazon EMR (Elastic MapReduce) for big data processing and Amazon Athena for querying data stored in Amazon S3.

Pinterest: Pinterest uses AWS for its BI and analytics needs, employing services like Amazon Redshift for data warehousing and Amazon Elasticsearch Service for real-time analytics.

The costs and start-up efforts required to employ Amazon AWS services vary depending on factors such as the scale of the infrastructure, the specific services utilized, and the complexity of the setup. However, AWS offers a pay-as-you-go pricing model, allowing organizations to pay only for the resources they consume without any upfront costs or long-term commitments. This can significantly reduce the initial investment required compared to building and managing similar infrastructure in-house, where organizations would need to invest in hardware, software licenses, and hiring specialized personnel for maintenance and support.

In terms of competition in the BI platform arena, several organizations compete with Amazon AWS, offering similar BI products and services. Some notable competitors include:

Microsoft Azure: Microsoft's cloud computing platform, Azure, offers various BI and analytics services, including Azure Synapse Analytics (formerly SQL Data Warehouse) and Power BI.
Azure provides similar capabilities to AWS but may appeal more to organizations already invested in the Microsoft ecosystem.

Google Cloud Platform (GCP): Google's cloud platform offers BI and analytics services like BigQuery for data warehousing and visualization tools like Data Studio. GCP competes with AWS and Azure, leveraging Google's expertise in data analytics and machine learning.

Snowflake: Snowflake is a cloud-based data warehousing platform that competes with Amazon Redshift. Snowflake offers features like instant scalability and separation of storage and compute, which may appeal to organizations looking for flexibility and cost-effectiveness.

The costs of these competing platforms can vary depending on factors like usage, storage, and specific features required. Each platform has its strengths and weaknesses, and the choice often depends on factors such as the organization's existing infrastructure, technical requirements, and budget constraints. AWS is known for its extensive range of services, global reach, and strong ecosystem of partners and third-party integrations, making it a popular choice for many organizations. However, Microsoft Azure and Google Cloud Platform also have their respective strengths and may be better suited for certain use cases or environments.

Case Study: The Big Promise of Big Data in Health Care

Discussion Questions

What goals was the federal government hoping to achieve by supporting EHR acquisition? Which of those goals are likely to be accomplished?

Students may mention that EHR systems track medical appointments, test results, health provider notes, communications, and other electronic data. The American Recovery and Reinvestment Act of 2009 allocated $40 billion in incentive payments to healthcare providers to encourage them to implement EHR systems. The goal was to move EHR adoption, which stood at a lackluster 30 percent in 2005, to 70 to 90 percent by 2019.
EHR systems have the potential to improve efficiency; improve patient access to their medical records; allow healthcare providers and patients to communicate more easily; increase transparency; reduce medical errors; and provide healthcare providers access to an ever-increasing amount of data about patients, medication, diagnosis, and treatments. The enormous pool of healthcare data will help in accomplishing the goal of reducing costs and identifying which treatment plans are most effective.

The federal government supported Electronic Health Record (EHR) acquisition with several goals in mind:

Improved Patient Care: One of the primary objectives was to enhance patient care by enabling healthcare providers to access comprehensive and up-to-date patient information easily. With EHRs, medical professionals can access a patient's medical history, medications, allergies, and test results swiftly, leading to more informed decision-making and better patient outcomes.

Efficiency and Cost Reduction: Another goal was to streamline healthcare processes and reduce costs. EHRs have the potential to automate various administrative tasks, such as billing and scheduling, which can save time and resources for healthcare facilities. Additionally, EHRs can minimize duplication of tests and procedures by providing a centralized repository of patient information, thus reducing unnecessary expenses.

Interoperability: The federal government aimed to promote interoperability among different healthcare systems and providers. By encouraging the adoption of standardized EHR systems, the government sought to facilitate the seamless exchange of patient data between different healthcare settings, improving care coordination and continuity.

Data-driven Research and Public Health Initiatives: EHRs offer vast amounts of structured health data that can be leveraged for research purposes and public health initiatives.
By supporting EHR acquisition, the government aimed to promote data-driven research, epidemiological studies, and the development of evidence-based healthcare practices.

Patient Empowerment and Engagement: EHRs have the potential to empower patients by granting them access to their own health information and involving them more actively in their care. Through patient portals and secure messaging systems, individuals can view their medical records, communicate with healthcare providers, and participate in shared decision-making processes.

While progress has been made toward achieving these goals, some challenges remain. Interoperability issues persist, hindering seamless data exchange between different EHR systems and healthcare organizations. Concerns about data privacy and security also need to be addressed to ensure patient confidentiality and trust in EHR systems. Overall, however, the adoption of EHRs has already led to improvements in patient care, efficiency, and data utilization within the healthcare sector, with the potential for further advancements as technology evolves and standards are refined.

What purpose do products such as Watson and Optum One serve? How does this differ from the potential promise of a collaborative research venture such as Optum Labs?

Students should mention that IBM, in 2012, partnered with the Memorial Sloan Kettering Cancer Center (MSKCC) to transform IBM's cognitive computing technology, called Watson, into an oncologist's assistant. This technology could diagnose and recommend treatment for cancer patients. IBM supplied Watson with two million pages of medical research papers. MSKCC provided 1.5 million patient records and the expertise of its oncologists. Together, they created a system that uses a patient's medical information to synthesize the best treatment plan and display the evidence used to create the plan.
Optum One, on the other hand, identifies gaps in care along with strategies to avoid patient hospitalization.

Students may also mention that UnitedHealth teamed up with the Mayo Clinic to establish Optum Labs in 2013. The new research center combined UnitedHealth's claims data from 100 million patients over 20 years with Mayo's five million clinical records covering 15 years and began mining the data for insights on how to improve healthcare. Before using patient data, Optum Labs first de-identifies it, as required by HIPAA. Any links between the data set and the identity of the contributor are cut to safeguard the privacy of the contributor. Optum Labs also carefully controls data access, including preventing researchers from pulling data from an individual patient.

Products like Watson and Optum One serve primarily as healthcare analytics platforms that utilize artificial intelligence and machine learning algorithms to analyze vast amounts of medical data. They aim to improve healthcare outcomes by providing insights into patient care, treatment effectiveness, cost management, and overall operational efficiency.

Watson, developed by IBM, is known for its cognitive computing capabilities, which can sift through large datasets, including unstructured data like medical records, research papers, and clinical notes, to extract relevant information and provide recommendations to healthcare professionals. Optum One, on the other hand, is a healthcare analytics and performance platform offered by Optum, a subsidiary of UnitedHealth Group. It integrates data from various sources such as electronic health records, claims data, and patient feedback to provide insights that help healthcare organizations make informed decisions about patient care, population health management, and resource allocation.

These products differ from collaborative research ventures like Optum Labs in their focus and approach.
While Watson and Optum One are commercial products designed to provide analytics solutions directly to healthcare organizations, Optum Labs is more of a collaborative research initiative. Optum Labs brings together industry partners, academic researchers, and healthcare experts to conduct research and develop innovative solutions to healthcare challenges. It focuses on generating new knowledge, validating hypotheses, and translating research findings into practical applications that can benefit patients, providers, and payers in the healthcare ecosystem.

What steps should Optum Labs take to ensure that its research is widely disseminated?

Optum Labs combined UnitedHealth's claims data from 100 million patients over 20 years with Mayo's five million clinical records covering 15 years and began mining the data for insights on how to improve healthcare. UnitedHealth also bought Humedica, a leading data analytics firm, to bring it into the project. Promising to make their research findings public, share their analytical tools, and work collaboratively, Optum Labs issued a call for partners to bring in more data. Drug companies such as Pfizer and Merck, major universities, the American Association of Retired Persons (AARP), and many others quickly joined the project, giving the center access to vast resources.

Optum Labs can take several steps to ensure that its research is widely disseminated:

Establish Partnerships: Collaborate with academic institutions, research organizations, and industry partners to amplify the reach of their research findings.

Utilize Multiple Channels: Disseminate research through various channels such as peer-reviewed journals, conferences, white papers, online platforms, and social media to reach diverse audiences.

Open Access: Embrace open access publishing models to make research freely available to anyone, anywhere, thereby maximizing its accessibility and impact.
Engage with Stakeholders: Actively engage with policymakers, healthcare professionals, patient advocacy groups, and other stakeholders to ensure that research findings are relevant and accessible to those who can implement them.

Public Relations and Media Outreach: Work with public relations professionals to promote research findings through press releases, media interviews, and articles in mainstream media outlets.

Create Educational Resources: Develop educational materials, such as webinars, workshops, and online courses, to disseminate research findings and facilitate knowledge transfer to relevant communities.

Community Engagement: Foster partnerships with local communities to ensure that research findings are disseminated and applied effectively at the grassroots level.

Translation Services: Provide translations of research findings into multiple languages to reach non-English-speaking audiences and facilitate global dissemination.

Data Sharing: Encourage data sharing and collaboration within the research community to facilitate further analysis and validation of findings.

Long-Term Engagement Strategy: Develop a long-term engagement strategy to sustain interest and dissemination efforts beyond the initial publication of research findings.

By implementing these strategies, Optum Labs can effectively disseminate its research findings to a wide audience and maximize their impact on improving healthcare outcomes.

How does Optum Labs protect the privacy of individual patients? Is this sufficient? If not, what else should be done?

Students may mention that Optum Labs first de-identifies patient data before using it, as required by HIPAA. Any links between the data set and the identity of the contributor are cut to safeguard the privacy of the contributor. Optum Labs also carefully controls data access, including preventing researchers from pulling data from an individual patient.
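The de-identification idea described above (cutting the link between a record and the patient's identity while still allowing records for the same patient to be joined) can be sketched in a few lines. This is a simplified illustration only: the record fields, the salted-hash approach, and the `pseudonymize` helper are assumptions for the example, not a description of Optum Labs' actual pipeline.

```python
import hashlib
import secrets

# One random salt per data release; discarding the salt afterwards cuts
# the link between the pseudonyms and the real identifiers.
SALT = secrets.token_hex(16)

def pseudonymize(patient_id: str) -> str:
    """Replace a patient identifier with a salted one-way hash."""
    return hashlib.sha256((SALT + patient_id).encode()).hexdigest()[:12]

record = {"patient_id": "MRN-00417", "name": "Jane Doe",
          "zip": "55901", "diagnosis": "type 2 diabetes"}

# Drop direct identifiers; keep a pseudonym so the same patient's
# records can still be linked within the research data set, and
# generalize quasi-identifiers such as ZIP code
deidentified = {
    "pid": pseudonymize(record["patient_id"]),
    "zip3": record["zip"][:3],
    "diagnosis": record["diagnosis"],
}
print(deidentified)
```

Real HIPAA de-identification is stricter than this sketch (the Safe Harbor method, for example, requires removing or generalizing a long list of identifier types), but the principle is the same: strip or transform anything that could tie a row back to a person.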
Optum Labs prioritizes patient privacy through a variety of measures:

De-identification: They anonymize patient data to remove personally identifiable information, such as names and addresses, before analysis. This ensures that individual identities are protected while still allowing for meaningful insights to be gleaned from the data.

Data Access Controls: Optum Labs restricts access to patient data to authorized personnel only. This helps prevent unauthorized use or disclosure of sensitive information.

Data Encryption: Patient data is often encrypted both in transit and at rest to safeguard against unauthorized access or interception.

Compliance with Regulations: They adhere to relevant healthcare privacy regulations such as the Health Insurance Portability and Accountability Act (HIPAA) in the United States, ensuring that their practices align with legal requirements.

While these measures are significant steps in safeguarding patient privacy, whether they are sufficient depends on various factors, including the sensitivity of the data involved, the potential risks of re-identification, and evolving privacy standards. To enhance privacy protection further, Optum Labs could consider:

Implementing Differential Privacy: This involves adding noise or randomness to the data to protect individual privacy while still allowing for accurate analysis at the aggregate level.

Enhanced Access Controls: Strengthening access controls and monitoring mechanisms to ensure that only authorized personnel can access and use patient data.

Regular Audits and Assessments: Conducting regular audits and privacy impact assessments to identify and mitigate any potential privacy risks or vulnerabilities in their systems and processes.

Educating Stakeholders: Providing education and training to employees and collaborators on the importance of patient privacy and best practices for handling sensitive data.
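The differential-privacy suggestion above, adding noise so that aggregate answers stay accurate while any individual's presence stays hidden, can be illustrated with the classic Laplace mechanism. The function name, the epsilon value, and the cohort count below are illustrative assumptions, not drawn from the case.

```python
import random

def dp_count(true_count: int, epsilon: float = 1.0) -> float:
    """Return a noisy count via the Laplace mechanism.

    A counting query has sensitivity 1 (adding or removing one patient
    changes the answer by at most 1), so Laplace noise with scale
    1/epsilon gives epsilon-differential privacy; smaller epsilon
    means stronger privacy and more noise.
    """
    # The difference of two iid Exponential(epsilon) draws is
    # Laplace-distributed with mean 0 and scale 1/epsilon.
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

# Aggregate query: how many patients in the cohort have a given diagnosis?
true_answer = 12873
releases = [dp_count(true_answer, epsilon=0.5) for _ in range(5)]
print(releases)  # each released value hovers near the true count
```

With epsilon = 0.5 the noise has a standard deviation of roughly 2.8, so researchers still get a usable cohort count while no single patient's inclusion can be confidently inferred from any one release.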
By continuously evaluating and improving their privacy protection measures, Optum Labs can strive to maintain the highest standards of patient privacy while still deriving valuable insights from healthcare data.

Do you think big data analytics can significantly reduce healthcare costs nationwide? What would the federal government, collaborative ventures like Optum Labs, healthcare providers, healthcare insurers, and patients need to do to make this happen?

Some students might think that big data analytics can significantly reduce healthcare costs nationwide, while others may not. Students who answer "Yes" to the first question might list the potential benefits of big data analytics. The supporters would argue that it is vital for the healthcare industry to nurture a culture of collaboration for the betterment of all, noting that many prominent organizations have flocked to join Optum Labs' collaborative initiatives. A student in favor of big data analytics might cite the following example from a collaborative venture like Optum Labs: "Metformin is usually the medication that doctors overwhelmingly prescribe to patients when they are first diagnosed with type 2 diabetes. An Optum Labs study using data from over 37,000 patients found that sulfonylurea drugs also have an equivalent effect on glucose control, quality of life, and longevity. Moreover, sulfonylurea drugs cost less, and patients who use this medication were able to wait longer before starting to take insulin."

Others who answer "No" to the first question might be concerned about their privacy and the exploitation of big data by collaborative business firms, healthcare insurers, and healthcare providers. A student opposed to big data analytics might state that they see big data as a valuable resource that large healthcare companies like UnitedHealth are now vying to control.
They argue that holders of critical data, such as clinical pathology laboratories, should think carefully before granting such a large healthcare company access to their data.

Big data analytics has the potential to significantly reduce healthcare costs nationwide by providing insights into many aspects of healthcare delivery, such as treatment effectiveness, disease management, resource allocation, and preventive care. Here's how different stakeholders can contribute to realizing this potential:

1. Federal Government: The federal government can promote the adoption of big data analytics in healthcare by investing in research, funding pilot projects, and creating regulatory frameworks that support data sharing and interoperability among healthcare providers and insurers. It can also incentivize the use of electronic health records (EHRs) and standardized data formats to facilitate data aggregation and analysis.

2. Collaborative Ventures like Optum Labs: Collaborative ventures such as Optum Labs can serve as platforms for sharing data, expertise, and best practices among healthcare stakeholders. By bringing together researchers, healthcare providers, insurers, and technology companies, these initiatives can accelerate the development and implementation of data-driven solutions to healthcare challenges.

3. Healthcare Providers: Healthcare providers can leverage big data analytics to improve clinical decision-making, optimize treatment pathways, and identify opportunities for cost savings. By analyzing large volumes of patient data, providers can identify patterns and trends that inform personalized treatment plans and reduce unnecessary tests, procedures, and hospitalizations. Providers can also use predictive analytics to identify high-risk patients and intervene early to prevent costly complications.

4. Healthcare Insurers: Healthcare insurers can use big data analytics to identify fraud and abuse, manage risk, and design more cost-effective benefit packages. By analyzing claims data, insurers can detect patterns of overutilization, billing errors, and unnecessary procedures, leading to significant cost savings. Insurers can also use predictive modeling to forecast healthcare costs and develop strategies for managing population health more effectively.

5. Patients: Patients can contribute to big data analytics efforts by consenting to the use of their health data for research and analysis. By sharing their medical histories, treatment outcomes, and lifestyle information, patients can help researchers and healthcare providers identify trends, risk factors, and treatment strategies that improve overall healthcare delivery and reduce costs. In turn, patients can benefit from personalized health recommendations and interventions based on big data analytics, leading to better health outcomes and lower healthcare costs in the long run.

Overall, realizing the potential of big data analytics to reduce healthcare costs nationwide will require collaboration and coordination among federal agencies, healthcare stakeholders, technology providers, and patients. By leveraging the power of data and analytics, healthcare can become more efficient, effective, and affordable for everyone.

Solution Manual for Information Technology for Managers, George W. Reynolds, 9781305389830
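The insurer use case described above, screening claims data for overutilization, can be sketched in a few lines. The claim records, provider names, and the z-score threshold below are entirely hypothetical, and real fraud-detection systems use far richer statistical and machine-learning models than this toy screen:

```python
import statistics

def flag_outlier_claims(claims, threshold=3.0):
    # Flag claim amounts more than `threshold` standard deviations above
    # the mean -- a crude first-pass screen for possible overutilization.
    amounts = [c["amount"] for c in claims]
    mean = statistics.mean(amounts)
    stdev = statistics.stdev(amounts)
    return [c for c in claims if (c["amount"] - mean) / stdev > threshold]

# Hypothetical claims data: 50 routine claims plus one extreme outlier.
claims = [{"provider": f"P{i}", "amount": 100.0} for i in range(50)]
claims.append({"provider": "P99", "amount": 5000.0})

flagged = flag_outlier_claims(claims)   # only the P99 claim is flagged
```

In practice, flagged claims would feed a human review queue rather than trigger automatic denial, since a statistical outlier is evidence of unusual billing, not proof of fraud.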