Data Analysis

297
Ebook

IBM SPSS Modeler Essentials. Effective techniques for building powerful data mining and predictive analytics solutions

Jesus Salcedo, Keith McCormick

IBM SPSS Modeler allows users to quickly and efficiently apply predictive analytics and gain insights from their data. With almost 25 years of history, Modeler is the most established and comprehensive data mining workbench available. Since it is popular in corporate settings, widely available in university settings, and highly compatible with all the latest technologies, it is the perfect way to start your data science and machine learning journey. This book takes a detailed, step-by-step approach to introducing data mining using the de facto standard process, CRISP-DM, and Modeler’s easy-to-learn “visual programming” style. You will learn how to read data into Modeler, assess data quality, prepare your data for modeling, find interesting patterns and relationships within your data, and export your predictions. Using a single case study throughout, this intentionally short and focused book sticks to the essentials. The authors have drawn upon their decades of teaching thousands of new users to choose those aspects of Modeler that you should learn first, so that you get off to a good start using proven best practices. This book provides an overview of various popular data modeling techniques and presents a detailed case study of how to use CHAID, a decision tree model. Assessing a model’s performance is as important as building it, and this book will also show you how to do that. Finally, you will see how you can score new data and export your predictions. By the end of this book, you will have a firm understanding of the basics of data mining and how to effectively use Modeler to build predictive models.
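Modeler itself is a visual workbench, so no coding is required, but the train-assess-score workflow the blurb describes can be sketched in Python. The sketch below uses scikit-learn's DecisionTreeClassifier (a CART implementation, not CHAID) as a stand-in, and the file names and column names are hypothetical, not from the book's case study.

```python
# A minimal, hypothetical sketch of the decision-tree workflow described above
# (train a model, assess its performance, score new data), using scikit-learn's
# CART tree as a stand-in for CHAID. File and column names are made up.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

df = pd.read_csv("customers.csv")           # hypothetical training data
X = df[["age", "income", "num_products"]]   # hypothetical predictor fields
y = df["churned"]                            # hypothetical target field

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=1
)

model = DecisionTreeClassifier(max_depth=4).fit(X_train, y_train)

# Assess performance on held-out data, then score (predict) new records.
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
new_customers = pd.read_csv("new_customers.csv")  # hypothetical scoring data
new_customers["prediction"] = model.predict(new_customers[X.columns])
```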

298
Ebook

Implementing Analytics Solutions Using Microsoft Fabric--DP-600 Exam Study Guide. Boost your skills with expert insights and certification-ready strategies for Microsoft analytics

Jagjeet Singh Makhija, Charles Odunukwe

The DP-600 exam tests your ability to design and implement analytics solutions using Microsoft Fabric, including planning data analytics environments, managing data integration and security, and optimizing performance. Written by two Microsoft specialists with over three decades of combined experience, this book will help you confidently prepare for the DP-600 exam by teaching you the skills that are essential for effectively implementing and designing analytics solutions. You’ll explore data analytics in Microsoft Fabric in detail and understand foundational topics such as data exploration, SQL querying, and data transformation, alongside advanced techniques such as semantic model optimization, performance tuning, and enterprise-scale model design. The book addresses strategic planning, data integration, security, scalability, and the complete project lifecycle, including version control, deployment, and continuous improvement. You’ll also get to grips with practical SQL integration with Microsoft Fabric components, with mock exams to help you reinforce what you’ve learned. By the end of this book, you’ll be able to plan, implement, and optimize analytics solutions using Microsoft Fabric, and you'll be well-equipped with the practical skills needed to tackle real-world data challenges and pass the DP-600 exam.
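To give a flavour of the SQL querying and data transformation topics mentioned above, here is a minimal PySpark sketch of the kind you might run in a Fabric notebook. The lakehouse table name "sales" and its columns are hypothetical, not taken from the book or the exam.

```python
# A minimal sketch of SQL exploration and DataFrame transformation of the sort
# the DP-600 objectives cover, assuming a Spark session (as in a Fabric
# notebook) and a hypothetical lakehouse table named "sales".
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# SQL exploration of the raw table.
spark.sql("SELECT region, COUNT(*) AS orders FROM sales GROUP BY region").show()

# Equivalent DataFrame-style transformation, written back for downstream use.
summary = (
    spark.read.table("sales")
    .withColumn("revenue", F.col("quantity") * F.col("unit_price"))
    .groupBy("region")
    .agg(F.sum("revenue").alias("total_revenue"))
)
summary.write.mode("overwrite").saveAsTable("sales_by_region")
```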

299
Ebook

Implementing Oracle API Platform Cloud Service. Design, deploy, and manage your APIs in Oracle’s new API Platform

Andrew Bell, Sander Rensen, Luis Weir, Phil Wilkins

Implementing Oracle API Platform Cloud Service moves from theory to practice using the newest Oracle API management platform. This critical new platform for Oracle developers allows you to interface the complex array of services your clients expect in the modern world. First, you'll get an overview of Oracle’s new platform, then you'll see a use case showing the functionality and use of this new platform for Oracle customers. Next, you’ll see the power of Apiary and begin designing your own APIs. From there, you’ll build and run microservices and set up the Oracle API gateways. Moving on, you’ll discover how to customize the developer portal and publish your own APIs. You’ll spend time looking at configuration management on the new platform, and implementing the OAuth 2.0 policy as well as custom policies. The latest finance modules from Oracle will be examined, with some of the third-party alternatives in sight as well. This broad-scoped book completes your journey with a clear examination of how to transition APIs from Oracle API Management 12c to the new Oracle API Platform, so that you can step into the future confidently.
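As a rough illustration of what an OAuth 2.0 policy like the one mentioned above enforces, the sketch below obtains a client-credentials token and calls an API through a gateway. The URLs, client ID, and secret are hypothetical placeholders, not actual Oracle API Platform endpoints.

```python
# A minimal sketch of calling an API protected by an OAuth 2.0 policy:
# fetch a token with the client-credentials grant, then present it as a
# Bearer token to the gateway. All URLs and credentials are placeholders.
import requests

TOKEN_URL = "https://idcs.example.com/oauth2/v1/token"    # hypothetical
API_URL = "https://gateway.example.com/orders/v1/orders"  # hypothetical

token_resp = requests.post(
    TOKEN_URL,
    data={"grant_type": "client_credentials", "scope": "orders.read"},
    auth=("my-client-id", "my-client-secret"),  # hypothetical credentials
    timeout=10,
)
token_resp.raise_for_status()
access_token = token_resp.json()["access_token"]

api_resp = requests.get(
    API_URL,
    headers={"Authorization": f"Bearer {access_token}"},
    timeout=10,
)
print(api_resp.status_code, api_resp.json())
```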

300
Ebook

Implementing Qlik Sense. Design, Develop, and Validate BI solutions for consultants

Kaushik Solanki, Ganapati Hegde

Qlik Sense is a leading platform for business intelligence (BI) solutions. Qlik Sense helps organizations make informed decisions based on the data they have. This book will teach you how to effectively use Qlik for optimum customer satisfaction. You will undergo a metamorphosis from a developer to a consultant who is capable of building the most suitable BI solutions for your clients. The book will take you through several business cases; this will give you enough insight to understand the needs of the client clearly and build a BI solution that meets or exceeds their expectations. Starting from the pre-project activities, you will go on to the actual execution of the project, the implementation, and even maintenance. This book will give you all the information you need, from strategy to requirement gathering to implementing BI solutions using Qlik Sense. The book will empower you to make the right decisions in tricky and difficult situations while developing analytics and dashboards.

301
Ebook

Implementing Splunk 7. Effective operational intelligence to transform machine-generated data into valuable business insight - Third Edition

James D. Miller

Splunk is the leading platform that fosters an efficient methodology and delivers ways to search, monitor, and analyze growing amounts of big data. This book will allow you to implement new services and utilize them to quickly and efficiently process machine-generated big data. We introduce you to all the new features, improvements, and offerings of Splunk 7. We cover the new modules of Splunk: Splunk Cloud and the Machine Learning Toolkit, which ease data usage. Furthermore, you will learn to use search terms effectively with Boolean and grouping operators. You will learn not only how to modify your searches to make them fast but also how to use wildcards efficiently. Later, you will learn how to use stats to aggregate values, chart to turn data into tables and charts, and timechart to display values over time; you'll also work with fields and chart enhancements and learn how to create a data model with faster data model acceleration. Once this is done, you will learn about XML dashboards, working with apps, building advanced dashboards, configuring and extending Splunk, advanced deployments, and more. Finally, we teach you how to use the Machine Learning Toolkit, along with best practices and tips to help you implement Splunk services effectively and efficiently. By the end of this book, you will have learned about the Splunk software as a whole and implemented Splunk services in your projects.
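As a hedged sketch of running the kind of SPL searches described above (stats, chart, timechart) from Python, the example below uses the splunk-sdk package (splunklib) against a local Splunk instance. The host, credentials, and the search string itself are placeholders, and the results reader class may differ between SDK versions.

```python
# A minimal sketch of submitting an SPL search that uses timechart, assuming
# the splunk-sdk package and a reachable Splunk instance. Host, port, and
# credentials are placeholders.
import splunklib.client as client
import splunklib.results as results

service = client.connect(
    host="localhost", port=8089,
    username="admin", password="changeme",
)

# timechart shows values over time; swap in stats or chart for aggregations.
query = "search index=_internal sourcetype=splunkd | timechart count BY log_level"
job = service.jobs.oneshot(query, output_mode="json")

for event in results.JSONResultsReader(job):
    if isinstance(event, dict):  # skip diagnostic messages from the server
        print(event)
```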

302
Ebook

Instant MapReduce Patterns - Hadoop Essentials How-to. Practical recipes to write your own MapReduce solution patterns for Hadoop programs

Liyanapathirannahelage H Perera

MapReduce is a technology that enables users to process large datasets, and Hadoop is an implementation of MapReduce. We are seeing more and more data becoming available, and it hides many insights that might hold the key to success or failure. MapReduce gives you the ability to analyze this data by writing code to process it. Instant MapReduce Patterns – Hadoop Essentials How-to is a concise introduction to Hadoop and programming with MapReduce. It aims to get you started and give you an overall feel for programming with Hadoop so that you will have a well-grounded foundation to understand and solve your MapReduce problems as needed. Instant MapReduce Patterns – Hadoop Essentials How-to starts with the configuration of Hadoop before moving on to writing simple examples and discussing MapReduce programming patterns. We will begin by installing Hadoop and writing a word count program. After that, we will cover the seven styles of MapReduce programs: analytics, set operations, cross-correlation, search, graph, joins, and clustering. For each case, you will learn the pattern and create a representative example program. The book also provides you with additional pointers to further enhance your Hadoop skills.
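To give a flavour of the word count program mentioned above, here is a minimal Python sketch in the Hadoop Streaming style: a mapper that emits (word, 1) pairs and a reducer that sums them. It only illustrates the map and reduce phases and is not code from the book, whose own examples may use Hadoop's Java API.

```python
# A minimal word count sketch: the mapper emits (word, 1) pairs and the
# reducer sums the counts for each word, mirroring the two MapReduce phases.
import sys
from itertools import groupby

def mapper(lines):
    """Emit (word, 1) for every word in the input lines."""
    for line in lines:
        for word in line.strip().split():
            yield word.lower(), 1

def reducer(pairs):
    """Sum the counts for each word; pairs are sorted by key first."""
    for word, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)

if __name__ == "__main__":
    # Read text from stdin, e.g.  cat input.txt | python wordcount.py
    for word, total in reducer(mapper(sys.stdin)):
        print(f"{word}\t{total}")
```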

303
Ebook

304
Ebook

Interactive Dashboards and Data Apps with Plotly and Dash. Harness the power of a fully fledged frontend web framework in Python – no JavaScript required

Elias Dabbas

Plotly's Dash framework is a life-saver for Python developers who want to develop complete data apps and interactive dashboards without JavaScript, but you'll need the right guide to make sure you’re getting the most out of it. With the help of this book, you'll be able to explore the functionalities of Dash for visualizing data in different ways. Interactive Dashboards and Data Apps with Plotly and Dash will first give you an overview of the Dash ecosystem, its main packages, and the third-party packages crucial for structuring and building different parts of your apps. You'll learn how to create a basic Dash app and add different features to it. Next, you’ll integrate controls such as dropdowns, checkboxes, sliders, date pickers, and more into the app and then link them to charts and other outputs. Depending on the data you are visualizing, you'll also add several types of charts, including scatter plots, line plots, bar charts, histograms, and maps, as well as explore the options available for customizing them. By the end of this book, you'll have developed the skills you need to create and deploy an interactive dashboard, handle complexities and code refactoring, and understand the process of improving your application.
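As a hedged sketch of the kind of app the book builds, the example below links a dropdown control to a Plotly line chart through a Dash callback. It uses the gapminder sample dataset bundled with Plotly Express and is an illustration in the same spirit, not code from the book.

```python
# A minimal Dash app: a dropdown control linked to a Plotly line chart via a
# callback, illustrating the controls-to-charts pattern described above.
from dash import Dash, dcc, html, Input, Output
import plotly.express as px

df = px.data.gapminder()  # sample dataset bundled with Plotly Express
app = Dash(__name__)

app.layout = html.Div([
    html.H1("Life expectancy by country"),
    dcc.Dropdown(
        id="country",
        options=sorted(df["country"].unique()),
        value="Canada",
    ),
    dcc.Graph(id="chart"),
])

@app.callback(Output("chart", "figure"), Input("country", "value"))
def update_chart(country):
    # Redraw the line chart whenever the dropdown value changes.
    subset = df[df["country"] == country]
    return px.line(subset, x="year", y="lifeExp", title=country)

if __name__ == "__main__":
    app.run(debug=True)
```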